Feb 25 15:45:47 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 25 15:45:47 crc restorecon[4681]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 25 15:45:47
crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 
15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc 
restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 
crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 
crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:47 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 
15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 15:45:48 crc 
restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 
15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 
15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 15:45:48 crc restorecon[4681]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 15:45:48 crc restorecon[4681]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 25 15:45:50 crc kubenswrapper[4937]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 25 15:45:50 crc kubenswrapper[4937]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 25 15:45:50 crc kubenswrapper[4937]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 25 15:45:50 crc kubenswrapper[4937]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 25 15:45:50 crc kubenswrapper[4937]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 25 15:45:50 crc kubenswrapper[4937]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.118935 4937 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123627 4937 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123651 4937 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123658 4937 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123663 4937 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123668 4937 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123676 4937 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123681 4937 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123687 4937 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123693 4937 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123699 4937 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123705 4937 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123712 4937 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123717 4937 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123723 4937 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123729 4937 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123733 4937 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123738 4937 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123742 4937 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123749 4937 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123756 4937 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123771 4937 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123777 4937 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123784 4937 feature_gate.go:330] unrecognized feature gate: Example Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123789 4937 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123794 4937 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123799 4937 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123804 4937 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123809 4937 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123813 4937 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123818 4937 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123824 4937 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123828 4937 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123833 4937 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123837 4937 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123843 4937 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123848 4937 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123853 4937 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123858 4937 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123863 4937 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123868 4937 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123872 4937 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123877 4937 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123883 4937 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123890 4937 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123898 4937 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123903 4937 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123907 4937 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123914 4937 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123919 4937 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123924 4937 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123928 4937 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123933 4937 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123937 4937 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123941 4937 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123946 4937 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123950 4937 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123955 4937 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123959 4937 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123963 4937 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123967 4937 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123972 4937 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123976 4937 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123982 4937 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123987 4937 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123991 4937 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.123995 4937 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.124000 4937 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.124004 4937 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.124009 4937 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.124013 4937 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation 
Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.124017 4937 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124138 4937 flags.go:64] FLAG: --address="0.0.0.0" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124152 4937 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124161 4937 flags.go:64] FLAG: --anonymous-auth="true" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124170 4937 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124178 4937 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124185 4937 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124193 4937 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124201 4937 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124207 4937 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124212 4937 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124219 4937 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124225 4937 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124230 4937 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124235 4937 flags.go:64] FLAG: --cgroup-root="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124241 4937 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124248 4937 flags.go:64] FLAG: --client-ca-file="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124254 4937 flags.go:64] FLAG: --cloud-config="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124259 4937 flags.go:64] FLAG: --cloud-provider="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124263 4937 flags.go:64] FLAG: --cluster-dns="[]" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124270 4937 flags.go:64] FLAG: --cluster-domain="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124275 4937 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124280 4937 flags.go:64] FLAG: --config-dir="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124286 4937 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124291 4937 flags.go:64] FLAG: --container-log-max-files="5" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124300 4937 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124306 4937 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124311 4937 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124317 4937 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 25 15:45:50 crc 
kubenswrapper[4937]: I0225 15:45:50.124323 4937 flags.go:64] FLAG: --contention-profiling="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124328 4937 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124333 4937 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124338 4937 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124345 4937 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124352 4937 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124357 4937 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124362 4937 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124367 4937 flags.go:64] FLAG: --enable-load-reader="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124373 4937 flags.go:64] FLAG: --enable-server="true" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124380 4937 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124388 4937 flags.go:64] FLAG: --event-burst="100" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124393 4937 flags.go:64] FLAG: --event-qps="50" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124398 4937 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124404 4937 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124409 4937 flags.go:64] FLAG: --eviction-hard="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124415 4937 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124420 4937 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124426 4937 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124431 4937 flags.go:64] FLAG: --eviction-soft="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124455 4937 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124461 4937 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124466 4937 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124471 4937 flags.go:64] FLAG: --experimental-mounter-path="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124476 4937 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124496 4937 flags.go:64] FLAG: --fail-swap-on="true" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124503 4937 flags.go:64] FLAG: --feature-gates="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124510 4937 flags.go:64] FLAG: --file-check-frequency="20s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124515 4937 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124520 4937 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 25 15:45:50 crc 
kubenswrapper[4937]: I0225 15:45:50.124526 4937 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124531 4937 flags.go:64] FLAG: --healthz-port="10248" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124536 4937 flags.go:64] FLAG: --help="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124551 4937 flags.go:64] FLAG: --hostname-override="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124556 4937 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124562 4937 flags.go:64] FLAG: --http-check-frequency="20s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124567 4937 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124573 4937 flags.go:64] FLAG: --image-credential-provider-config="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124578 4937 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124583 4937 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124589 4937 flags.go:64] FLAG: --image-service-endpoint="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124595 4937 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124600 4937 flags.go:64] FLAG: --kube-api-burst="100" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124606 4937 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124611 4937 flags.go:64] FLAG: --kube-api-qps="50" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124617 4937 flags.go:64] FLAG: --kube-reserved="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124622 4937 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124627 4937 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124632 4937 flags.go:64] FLAG: --kubelet-cgroups="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124637 4937 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124642 4937 flags.go:64] FLAG: --lock-file="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124648 4937 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124653 4937 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124659 4937 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124667 4937 flags.go:64] FLAG: --log-json-split-stream="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124672 4937 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124677 4937 flags.go:64] FLAG: --log-text-split-stream="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124682 4937 flags.go:64] FLAG: --logging-format="text" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124688 4937 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124694 4937 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 25 15:45:50 crc 
kubenswrapper[4937]: I0225 15:45:50.124699 4937 flags.go:64] FLAG: --manifest-url="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124704 4937 flags.go:64] FLAG: --manifest-url-header="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124711 4937 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124716 4937 flags.go:64] FLAG: --max-open-files="1000000" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124723 4937 flags.go:64] FLAG: --max-pods="110" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124728 4937 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124734 4937 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124739 4937 flags.go:64] FLAG: --memory-manager-policy="None" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124745 4937 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124750 4937 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124756 4937 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124762 4937 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124775 4937 flags.go:64] FLAG: --node-status-max-images="50" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124780 4937 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124786 4937 flags.go:64] FLAG: --oom-score-adj="-999" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124792 4937 flags.go:64] FLAG: --pod-cidr="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124799 4937 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124808 4937 flags.go:64] FLAG: --pod-manifest-path="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124813 4937 flags.go:64] FLAG: --pod-max-pids="-1" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124819 4937 flags.go:64] FLAG: --pods-per-core="0" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124824 4937 flags.go:64] FLAG: --port="10250" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124829 4937 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124834 4937 flags.go:64] FLAG: --provider-id="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124839 4937 flags.go:64] FLAG: --qos-reserved="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124845 4937 flags.go:64] FLAG: --read-only-port="10255" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124850 4937 flags.go:64] FLAG: --register-node="true" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124855 4937 flags.go:64] FLAG: --register-schedulable="true" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124860 4937 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124869 4937 flags.go:64] FLAG: --registry-burst="10" Feb 25 15:45:50 crc 
kubenswrapper[4937]: I0225 15:45:50.124874 4937 flags.go:64] FLAG: --registry-qps="5" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124880 4937 flags.go:64] FLAG: --reserved-cpus="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124885 4937 flags.go:64] FLAG: --reserved-memory="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124892 4937 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124897 4937 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124904 4937 flags.go:64] FLAG: --rotate-certificates="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124910 4937 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124916 4937 flags.go:64] FLAG: --runonce="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124922 4937 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124928 4937 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124934 4937 flags.go:64] FLAG: --seccomp-default="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124940 4937 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124946 4937 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124953 4937 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124959 4937 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124966 4937 flags.go:64] FLAG: --storage-driver-password="root" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124972 4937 flags.go:64] FLAG: --storage-driver-secure="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124978 4937 flags.go:64] FLAG: --storage-driver-table="stats" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124983 4937 flags.go:64] FLAG: --storage-driver-user="root" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124989 4937 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.124994 4937 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125000 4937 flags.go:64] FLAG: --system-cgroups="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125005 4937 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125016 4937 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125021 4937 flags.go:64] FLAG: --tls-cert-file="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125026 4937 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125033 4937 flags.go:64] FLAG: --tls-min-version="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125038 4937 flags.go:64] FLAG: --tls-private-key-file="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125043 4937 flags.go:64] FLAG: --topology-manager-policy="none" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125048 4937 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 25 15:45:50 crc kubenswrapper[4937]: 
I0225 15:45:50.125053 4937 flags.go:64] FLAG: --topology-manager-scope="container" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125058 4937 flags.go:64] FLAG: --v="2" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125065 4937 flags.go:64] FLAG: --version="false" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125072 4937 flags.go:64] FLAG: --vmodule="" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125078 4937 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125084 4937 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125210 4937 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125217 4937 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125223 4937 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125228 4937 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125233 4937 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125238 4937 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125243 4937 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125247 4937 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125251 4937 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125256 4937 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125260 4937 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125266 4937 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125271 4937 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125275 4937 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125279 4937 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125284 4937 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125288 4937 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125294 4937 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
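[editorial note] The long run of flags.go:64 "FLAG: --name=\"value\"" entries above is the kubelet echoing every flag value it starts with, defaults included, which is handy when comparing what systemd actually passed against the config file. A rough, self-contained Go sketch of the same pattern using the standard flag package (this is an illustration, not the kubelet's own flags.go):

package main

import (
	"flag"
	"log"
)

func main() {
	// A couple of illustrative flags; the real kubelet registers many more.
	nodeIP := flag.String("node-ip", "", "IP address of the node")
	maxPods := flag.Int("max-pods", 110, "maximum number of pods")
	flag.Parse()

	// Echo every registered flag and its effective value, defaults included,
	// mirroring the FLAG: --name="value" lines in the journal above.
	flag.VisitAll(func(f *flag.Flag) {
		log.Printf("FLAG: --%s=%q", f.Name, f.Value.String())
	})

	_ = nodeIP
	_ = maxPods
}
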
Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125300 4937 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125305 4937 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125310 4937 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125315 4937 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125320 4937 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125326 4937 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125330 4937 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125335 4937 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125340 4937 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125346 4937 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125351 4937 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125356 4937 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125362 4937 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125366 4937 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125371 4937 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125376 4937 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125381 4937 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125386 4937 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125390 4937 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125395 4937 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125400 4937 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125404 4937 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125408 4937 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125414 4937 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125420 4937 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125433 4937 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125438 4937 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125442 4937 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125447 4937 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125451 4937 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125456 4937 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125460 4937 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125464 4937 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125469 4937 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125473 4937 feature_gate.go:330] unrecognized feature gate: Example Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125478 4937 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125500 4937 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125505 4937 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125510 4937 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125515 4937 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125519 4937 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125525 4937 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125529 4937 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125534 4937 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125544 4937 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125548 4937 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125552 4937 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125559 4937 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125564 4937 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125570 4937 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125576 4937 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125581 4937 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.125587 4937 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.125603 4937 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.138408 4937 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.138468 4937 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138737 4937 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138757 4937 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138768 4937 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138777 4937 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138789 4937 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138800 4937 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138809 4937 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138817 4937 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138825 4937 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138834 4937 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138845 4937 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
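[editorial note] The W-level feature_gate.go:330/351/353 entries and the I-level feature_gate.go:386 "feature gates: {map[...]}" summary above show the pattern: gate names this kubelet build does not register (the OpenShift-specific ones) are warned about and ignored, explicitly setting a GA or deprecated gate warns that the override will go away, and the effective map is then logged. A small self-contained Go sketch of that behavior; it is an illustration only, not the k8s.io/component-base/featuregate code the kubelet actually uses, and the gate list is a tiny assumed subset:

package main

import "fmt"

type stage int

const (
	alpha stage = iota
	beta
	ga
	deprecated
)

// Known gates and their stage; a real kubelet registers hundreds of these.
var known = map[string]stage{
	"CloudDualStackNodeIPs":                  ga,
	"DisableKubeletCloudCredentialProviders": ga,
	"ValidatingAdmissionPolicy":              ga,
	"KMSv1":                                  deprecated,
	"NodeSwap":                               beta,
}

// apply mirrors the journal: warn on unknown gates, warn when a GA or
// deprecated gate is set explicitly, and return the effective map.
func apply(requested map[string]bool) map[string]bool {
	effective := map[string]bool{}
	for name, val := range requested {
		st, ok := known[name]
		if !ok {
			fmt.Printf("W unrecognized feature gate: %s\n", name)
			continue
		}
		switch st {
		case ga:
			fmt.Printf("W Setting GA feature gate %s=%v. It will be removed in a future release.\n", name, val)
		case deprecated:
			fmt.Printf("W Setting deprecated feature gate %s=%v. It will be removed in a future release.\n", name, val)
		}
		effective[name] = val
	}
	return effective
}

func main() {
	// A few of the gate names visible in the log; OpenShift-only names are unknown here.
	req := map[string]bool{
		"CloudDualStackNodeIPs": true,
		"KMSv1":                 true,
		"GatewayAPI":            true, // unrecognized in this sketch
	}
	fmt.Printf("I feature gates: %v\n", apply(req))
}
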
Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138856 4937 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138864 4937 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138874 4937 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138884 4937 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138894 4937 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138903 4937 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138912 4937 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138920 4937 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138929 4937 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138937 4937 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138948 4937 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138960 4937 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138969 4937 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138979 4937 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138988 4937 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.138997 4937 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139005 4937 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139012 4937 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139020 4937 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139028 4937 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139038 4937 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139048 4937 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139058 4937 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139071 4937 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139082 4937 feature_gate.go:330] unrecognized feature gate: Example Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139093 4937 feature_gate.go:330] unrecognized feature gate: 
GCPLabelsTags Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139103 4937 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139113 4937 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139122 4937 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139130 4937 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139138 4937 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139146 4937 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139154 4937 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139161 4937 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139169 4937 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139177 4937 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139185 4937 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139193 4937 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139204 4937 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139212 4937 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139220 4937 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139227 4937 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139235 4937 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139243 4937 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139251 4937 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139259 4937 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139267 4937 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139275 4937 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139283 4937 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139293 4937 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139302 4937 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139312 4937 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139320 4937 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139330 4937 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139338 4937 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139346 4937 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139355 4937 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139363 4937 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139370 4937 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139380 4937 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.139394 4937 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139645 4937 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139661 4937 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139670 4937 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139679 4937 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139687 4937 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139696 4937 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139703 4937 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139711 4937 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139721 4937 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139732 4937 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139744 4937 feature_gate.go:330] unrecognized feature gate: Example Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139753 4937 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139762 4937 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139771 4937 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139780 4937 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139788 4937 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139796 4937 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139804 4937 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139811 4937 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139819 4937 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139827 4937 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139835 4937 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139842 4937 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139850 4937 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139858 4937 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139866 4937 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139873 4937 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139883 4937 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139892 4937 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139900 4937 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139908 4937 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139916 4937 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139924 4937 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139932 4937 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139940 4937 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139948 4937 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139956 4937 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139964 4937 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139971 4937 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139979 4937 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.139990 4937 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140001 4937 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140011 4937 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140020 4937 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140029 4937 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140039 4937 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140050 4937 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140060 4937 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140070 4937 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140080 4937 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140089 4937 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140097 4937 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140105 4937 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140113 4937 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140121 4937 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140129 4937 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140138 4937 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140146 4937 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140154 4937 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140162 4937 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140170 4937 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140178 4937 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140186 4937 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140193 4937 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140201 4937 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140208 4937 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140216 4937 feature_gate.go:330] unrecognized feature gate: 
AzureWorkloadIdentity Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140223 4937 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140231 4937 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140241 4937 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.140252 4937 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.140264 4937 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.141251 4937 server.go:940] "Client rotation is on, will bootstrap in background" Feb 25 15:45:50 crc kubenswrapper[4937]: E0225 15:45:50.146251 4937 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.152139 4937 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.153000 4937 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
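[editorial note] The bootstrap.go:266 error above reports that the client certificate embedded in /var/lib/kubelet/kubeconfig expires 2026-02-24, so the kubelet falls back to the bootstrap credentials and requests a fresh certificate (the rotation entries that follow). A quick way to reproduce that expiry check against a PEM file such as /var/lib/kubelet/pki/kubelet-client-current.pem is a short standard-library Go sketch; the path is taken from the log, everything else is illustrative and is not the kubelet's bootstrap code:

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Path as logged by certificate_store.go above.
	data, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
	if err != nil {
		panic(err)
	}
	// The file holds the client cert (and key); walk the PEM blocks and
	// report NotAfter for each CERTIFICATE block found.
	for block, rest := pem.Decode(data); block != nil; block, rest = pem.Decode(rest) {
		if block.Type != "CERTIFICATE" {
			continue
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			panic(err)
		}
		expired := time.Now().After(cert.NotAfter)
		fmt.Printf("subject=%s notAfter=%s expired=%v\n", cert.Subject, cert.NotAfter, expired)
	}
}
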
Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.155899 4937 server.go:997] "Starting client certificate rotation" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.155967 4937 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.156190 4937 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.282115 4937 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 25 15:45:50 crc kubenswrapper[4937]: E0225 15:45:50.308538 4937 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.309599 4937 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.357153 4937 log.go:25] "Validated CRI v1 runtime API" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.914949 4937 log.go:25] "Validated CRI v1 image API" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.917135 4937 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.922828 4937 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-25-15-40-46-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.922886 4937 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.941262 4937 manager.go:217] Machine: {Timestamp:2026-02-25 15:45:50.938733464 +0000 UTC m=+1.952125374 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:3428e233-7684-44b8-9625-ba84b10ba5fc BootID:64e5ecf3-73e7-446f-b239-94e67d809649 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365408768 Type:vfs Inodes:821633 
HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:fb:50:57 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:fb:50:57 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b2:18:b2 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2c:95:89 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ce:58:64 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:e0:dc:a5 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:2e:76:09:0b:02:c6 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d6:12:a2:15:6c:88 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data 
Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.941567 4937 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.941756 4937 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.942384 4937 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.942704 4937 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.942748 4937 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.943025 4937 topology_manager.go:138] "Creating topology manager with none policy" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.943039 4937 container_manager_linux.go:303] "Creating device plugin manager" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.956221 4937 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 25 15:45:50 crc 
kubenswrapper[4937]: I0225 15:45:50.956270 4937 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.957247 4937 state_mem.go:36] "Initialized new in-memory state store" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.957403 4937 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.961819 4937 kubelet.go:418] "Attempting to sync node with API server" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.961851 4937 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.961904 4937 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.961923 4937 kubelet.go:324] "Adding apiserver pod source" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.961940 4937 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 25 15:45:50 crc kubenswrapper[4937]: I0225 15:45:50.968508 4937 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.969104 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:50 crc kubenswrapper[4937]: W0225 15:45:50.969158 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:50 crc kubenswrapper[4937]: E0225 15:45:50.969270 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:50 crc kubenswrapper[4937]: E0225 15:45:50.969242 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.035942 4937 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
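Every request to the control plane in this stretch fails the same way: the TCP dial to api-int.crc.testing:6443 (38.102.83.130) is refused, so the certificate signing request and the Node/Service informer list calls cannot complete yet. A minimal probe, assuming only the Go standard library and the endpoint taken from the log, reproduces exactly that dial and is a quick way to tell when the apiserver becomes reachable:

```go
// dialprobe.go - minimal sketch: attempts the same TCP connection that the
// kubelet's CSR post and reflector list calls are failing on in the log
// ("dial tcp ...:6443: connect: connection refused").
package main

import (
	"fmt"
	"net"
	"os"
	"time"
)

func main() {
	addr := "api-int.crc.testing:6443" // endpoint taken from the log lines above
	conn, err := net.DialTimeout("tcp", addr, 3*time.Second)
	if err != nil {
		fmt.Fprintf(os.Stderr, "%s unreachable: %v\n", addr, err)
		os.Exit(1)
	}
	defer conn.Close()
	fmt.Printf("%s reachable (remote %s)\n", addr, conn.RemoteAddr())
}
```

A "connection refused" from this probe at boot is expected on a single-node setup like this one, since the apiserver static pod has not started yet; the kubelet simply retries, as the repeated reflector errors indicate.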
Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.068738 4937 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.261339 4937 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.261420 4937 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.261432 4937 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.261442 4937 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.261458 4937 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.261469 4937 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.261481 4937 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.261551 4937 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.261577 4937 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.261591 4937 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.261613 4937 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.261626 4937 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.262823 4937 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.263708 4937 server.go:1280] "Started kubelet" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.264043 4937 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.264074 4937 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.264572 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.265394 4937 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 25 15:45:51 crc systemd[1]: Started Kubernetes Kubelet. 
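At this point the kubelet itself is up: systemd reports "Started Kubernetes Kubelet", the server listens on 0.0.0.0:10250, and the podresources API is served at unix:/var/lib/kubelet/pod-resources/kubelet.sock. The sketch below (socket path taken from the log; only a reachability check, since a real consumer would speak gRPC over this socket) confirms the endpoint is accepting connections:

```go
// podresources_probe.go - sketch using only the standard library: verifies that
// the podresources unix socket the kubelet reports serving above exists and
// accepts connections. It does not implement the gRPC protocol itself.
package main

import (
	"fmt"
	"net"
	"os"
	"time"
)

func main() {
	sock := "/var/lib/kubelet/pod-resources/kubelet.sock" // socket path from the log line above
	if _, err := os.Stat(sock); err != nil {
		fmt.Fprintln(os.Stderr, "socket missing:", err)
		os.Exit(1)
	}
	conn, err := net.DialTimeout("unix", sock, 2*time.Second)
	if err != nil {
		fmt.Fprintln(os.Stderr, "dial failed:", err)
		os.Exit(1)
	}
	defer conn.Close()
	fmt.Println("podresources socket is accepting connections:", sock)
}
```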
Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.267657 4937 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.267711 4937 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.268261 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.268291 4937 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.268432 4937 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.268518 4937 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.268848 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="200ms" Feb 25 15:45:51 crc kubenswrapper[4937]: W0225 15:45:51.270753 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.270939 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.272777 4937 server.go:460] "Adding debug handlers to kubelet server" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.273082 4937 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189787da94dbeeb6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.26364127 +0000 UTC m=+2.277033170,LastTimestamp:2026-02-25 15:45:51.26364127 +0000 UTC m=+2.277033170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.277531 4937 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.277571 4937 factory.go:55] Registering systemd factory Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.277585 4937 factory.go:221] Registration of the systemd container factory successfully Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.278872 4937 factory.go:153] 
Registering CRI-O factory Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.278890 4937 factory.go:221] Registration of the crio container factory successfully Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.278924 4937 factory.go:103] Registering Raw factory Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.278944 4937 manager.go:1196] Started watching for new ooms in manager Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.279679 4937 manager.go:319] Starting recovery of all containers Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294235 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294329 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294364 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294382 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294404 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294419 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294448 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294480 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294519 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294537 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294562 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294588 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294699 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294734 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294757 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294773 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294792 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294811 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294829 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294851 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294872 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294888 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294918 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294944 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294975 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.294998 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295031 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295053 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295083 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295101 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295124 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295141 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295168 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295182 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295197 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295218 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295243 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295258 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295278 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295292 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295310 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295324 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295344 4937 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295364 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295389 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295409 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295421 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295433 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295457 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295472 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295506 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295522 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295548 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295569 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295584 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295606 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295630 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295649 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295662 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295679 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295700 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295723 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295746 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295761 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295781 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295794 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295815 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295835 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295855 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295873 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295906 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295922 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295941 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295957 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295977 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.295992 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.296011 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.296031 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.296044 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.298646 4937 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.298737 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.298767 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.298785 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.298815 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.298831 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.298847 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.298872 4937 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.298892 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299067 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299083 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299108 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299138 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299156 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299189 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299203 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299217 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299338 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299353 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299370 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299383 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299396 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299410 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299423 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299441 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299453 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.299570 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322449 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322528 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322563 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322583 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322610 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322637 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322656 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322680 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322696 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322714 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322730 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322747 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322758 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322771 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322793 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322810 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322845 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322864 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.322885 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323190 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323211 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323234 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323260 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323281 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323323 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323342 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323362 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323383 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323398 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323421 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323445 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323462 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323473 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323505 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323520 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.323532 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.324742 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.324867 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.324935 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.324953 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.324967 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.324981 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.324994 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325009 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325024 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325040 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325053 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325070 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325083 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325095 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325112 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325126 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325140 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325154 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325170 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325185 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325199 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325213 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325227 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325240 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325256 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325271 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325289 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325312 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325329 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325348 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325371 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325389 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325405 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325419 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325432 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325448 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325464 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325506 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325532 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325550 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325562 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325579 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325592 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325624 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325635 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325649 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325664 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325676 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325689 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325700 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325718 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325730 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325752 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325766 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325779 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325791 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325805 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325820 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325835 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325851 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325867 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325881 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325893 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325907 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325921 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325948 4937 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325967 4937 reconstruct.go:97] "Volume reconstruction finished" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.325980 4937 reconciler.go:26] "Reconciler: start to sync state" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.326481 4937 manager.go:324] Recovery completed Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.339784 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.341600 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.341649 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.341662 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.342952 4937 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.342973 4937 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.343014 4937 state_mem.go:36] "Initialized new in-memory state store" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.363232 4937 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.366156 4937 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.366257 4937 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 25 15:45:51 crc kubenswrapper[4937]: I0225 15:45:51.366307 4937 kubelet.go:2335] "Starting kubelet main sync loop" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.366391 4937 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 25 15:45:51 crc kubenswrapper[4937]: W0225 15:45:51.367621 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.367700 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.368426 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.467535 4937 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.468698 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.469833 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="400ms" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.568797 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.667957 4937 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.669184 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.769287 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.869820 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.871542 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="800ms" Feb 25 15:45:51 crc kubenswrapper[4937]: E0225 15:45:51.969966 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.069082 4937 kubelet.go:2359] "Skipping pod 
synchronization" err="container runtime status check may not have completed yet" Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.070235 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:45:52 crc kubenswrapper[4937]: W0225 15:45:52.122305 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.122448 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.171267 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:45:52 crc kubenswrapper[4937]: W0225 15:45:52.187124 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.187264 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.244692 4937 policy_none.go:49] "None policy: Start" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.246140 4937 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.246214 4937 state_mem.go:35] "Initializing new in-memory state store" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.266358 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.272038 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:45:52 crc kubenswrapper[4937]: W0225 15:45:52.361525 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.361615 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.372173 4937 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.403480 4937 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.404753 4937 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:52 crc kubenswrapper[4937]: W0225 15:45:52.424044 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.424150 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.472978 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.526152 4937 manager.go:334] "Starting Device Plugin manager" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.526232 4937 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.526255 4937 server.go:79] "Starting device plugin registration server" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.527469 4937 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.527689 4937 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.528216 4937 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.528454 4937 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.528469 4937 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.543925 4937 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.629244 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.631169 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.631208 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.631219 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.631243 4937 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.631819 4937 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.672599 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="1.6s" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.832401 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.834309 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.834355 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.834368 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.834403 4937 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 15:45:52 crc kubenswrapper[4937]: E0225 15:45:52.834990 4937 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.869820 4937 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.870029 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.872201 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.872265 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.872289 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.872576 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.872966 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.873028 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.874089 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.874124 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.874142 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.874163 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.874208 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.874233 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.874469 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.874558 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.874666 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.876362 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.876434 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.876457 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.876551 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.876610 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.876633 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.876861 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.877072 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.877236 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.878305 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.878365 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.878390 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.878694 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.878947 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.878973 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.878982 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.879037 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.879199 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.879997 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.880030 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.880041 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.880260 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.880296 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.880753 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.880909 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.881024 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.881027 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.881228 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:52 crc kubenswrapper[4937]: I0225 15:45:52.881238 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.044631 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.045203 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.045235 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.045272 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.045304 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.045326 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.045369 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.045452 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.045556 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.045612 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.045760 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.045897 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.045959 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.046010 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.046054 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 15:45:53 crc 
kubenswrapper[4937]: I0225 15:45:53.147436 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147550 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147594 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147630 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147668 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147702 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147739 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147743 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147845 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147783 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147923 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147914 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147946 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147985 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.148031 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147994 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147950 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.148076 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.147977 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.148121 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 
25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.148071 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.148083 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.148209 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.148253 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.148284 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.148319 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.148786 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.148838 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.148876 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.148933 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.234167 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.235894 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.238789 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.238869 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.238895 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.238945 4937 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 15:45:53 crc kubenswrapper[4937]: E0225 15:45:53.239929 4937 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.266129 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.268285 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.293569 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.311399 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.312574 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:45:53 crc kubenswrapper[4937]: W0225 15:45:53.322197 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-40ec71075eae1dd1f1ffa6fb00921807e5af61e79b89aefc678a130ea9eb7a31 WatchSource:0}: Error finding container 40ec71075eae1dd1f1ffa6fb00921807e5af61e79b89aefc678a130ea9eb7a31: Status 404 returned error can't find the container with id 40ec71075eae1dd1f1ffa6fb00921807e5af61e79b89aefc678a130ea9eb7a31 Feb 25 15:45:53 crc kubenswrapper[4937]: W0225 15:45:53.326576 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-81ea5e90db0d84092b579209d2da8efb9ac246294698b34729b84b7c352962b8 WatchSource:0}: Error finding container 81ea5e90db0d84092b579209d2da8efb9ac246294698b34729b84b7c352962b8: Status 404 returned error can't find the container with id 81ea5e90db0d84092b579209d2da8efb9ac246294698b34729b84b7c352962b8 Feb 25 15:45:53 crc kubenswrapper[4937]: W0225 15:45:53.338888 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-14f552fbb229f196702ec6cfc684298bfbffa2733a04cc158f4251c9fcdb16dd WatchSource:0}: Error finding container 14f552fbb229f196702ec6cfc684298bfbffa2733a04cc158f4251c9fcdb16dd: Status 404 returned error can't find the container with id 14f552fbb229f196702ec6cfc684298bfbffa2733a04cc158f4251c9fcdb16dd Feb 25 15:45:53 crc kubenswrapper[4937]: W0225 15:45:53.341654 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c4a1d0bf22ebb177d2bee101c8bb878eeb41450ce654a9f52fcfce5222466cad WatchSource:0}: Error finding container c4a1d0bf22ebb177d2bee101c8bb878eeb41450ce654a9f52fcfce5222466cad: Status 404 returned error can't find the container with id c4a1d0bf22ebb177d2bee101c8bb878eeb41450ce654a9f52fcfce5222466cad Feb 25 15:45:53 crc kubenswrapper[4937]: W0225 15:45:53.345978 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-c1807f6710b197667138506654388f0ab53907e2c38813bf266d617f10fec5ef WatchSource:0}: Error finding container c1807f6710b197667138506654388f0ab53907e2c38813bf266d617f10fec5ef: Status 404 returned error can't find the container with id c1807f6710b197667138506654388f0ab53907e2c38813bf266d617f10fec5ef Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.375107 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"40ec71075eae1dd1f1ffa6fb00921807e5af61e79b89aefc678a130ea9eb7a31"} Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.376195 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c1807f6710b197667138506654388f0ab53907e2c38813bf266d617f10fec5ef"} Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.378003 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c4a1d0bf22ebb177d2bee101c8bb878eeb41450ce654a9f52fcfce5222466cad"} Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.378884 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"14f552fbb229f196702ec6cfc684298bfbffa2733a04cc158f4251c9fcdb16dd"} Feb 25 15:45:53 crc kubenswrapper[4937]: I0225 15:45:53.379613 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"81ea5e90db0d84092b579209d2da8efb9ac246294698b34729b84b7c352962b8"} Feb 25 15:45:53 crc kubenswrapper[4937]: W0225 15:45:53.937895 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:53 crc kubenswrapper[4937]: E0225 15:45:53.938013 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:54 crc kubenswrapper[4937]: I0225 15:45:54.040640 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:54 crc kubenswrapper[4937]: I0225 15:45:54.042405 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:54 crc kubenswrapper[4937]: I0225 15:45:54.042448 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:54 crc kubenswrapper[4937]: I0225 15:45:54.042460 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:54 crc kubenswrapper[4937]: I0225 15:45:54.042507 4937 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 15:45:54 crc kubenswrapper[4937]: E0225 15:45:54.043042 4937 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Feb 25 15:45:54 crc kubenswrapper[4937]: I0225 15:45:54.265773 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:54 crc kubenswrapper[4937]: E0225 15:45:54.273590 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="3.2s" Feb 25 15:45:54 crc kubenswrapper[4937]: W0225 15:45:54.822599 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 
38.102.83.130:6443: connect: connection refused Feb 25 15:45:54 crc kubenswrapper[4937]: E0225 15:45:54.823323 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:55 crc kubenswrapper[4937]: W0225 15:45:55.043864 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:55 crc kubenswrapper[4937]: E0225 15:45:55.043991 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:55 crc kubenswrapper[4937]: W0225 15:45:55.257988 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:55 crc kubenswrapper[4937]: E0225 15:45:55.258180 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.266914 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.385776 4937 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2" exitCode=0 Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.385864 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2"} Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.385941 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.387159 4937 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="b0df89d85c9e461bd8a205ce20ba9c2e6b46dd742027eaa0af058a84a6a2bd52" exitCode=0 Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.387255 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.387251 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"b0df89d85c9e461bd8a205ce20ba9c2e6b46dd742027eaa0af058a84a6a2bd52"} Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.387659 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.387684 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.387695 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.388872 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.388898 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.388909 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.390435 4937 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3" exitCode=0 Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.390524 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3"} Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.390596 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.402162 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.402206 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.402219 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.402618 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75"} Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.402646 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018"} Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.404685 4937 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c" exitCode=0 Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.404741 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c"} Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.404786 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.406409 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.406442 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.406454 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.410275 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.411345 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.411399 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.411412 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.643855 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.645340 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.645372 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.645381 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:55 crc kubenswrapper[4937]: I0225 15:45:55.645407 4937 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 15:45:55 crc kubenswrapper[4937]: E0225 15:45:55.645982 4937 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.265549 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.410441 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"169381c1c0271e964095447cf0d45da0a846d0f98f2e5a70395b5415f6c81524"} Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.410621 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.412086 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.412147 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.412188 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.414289 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9"} Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.414343 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1"} Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.417556 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c"} Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.417609 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb"} Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.417616 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.418695 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.418731 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.418748 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.420446 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa"} Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.420542 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea"} Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.422933 4937 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced" exitCode=0 Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.422983 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced"} 
Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.423091 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.424296 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.424347 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.424365 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:56 crc kubenswrapper[4937]: I0225 15:45:56.696297 4937 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 25 15:45:56 crc kubenswrapper[4937]: E0225 15:45:56.697250 4937 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.265948 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:57 crc kubenswrapper[4937]: E0225 15:45:57.474768 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="6.4s" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.491184 4937 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff" exitCode=0 Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.491271 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff"} Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.491401 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.492530 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.492561 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.492574 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.495683 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0"} Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.495957 4937 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.497826 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.497872 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.497891 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.502744 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.503425 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.503984 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e9fed912c3e122fa040f93102e35c74e835205f89fe58ce3dd216808511810aa"} Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.504028 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9"} Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.504050 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925"} Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.504143 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.505307 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.505388 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.505406 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.505424 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.505462 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.505479 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.506154 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.506251 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:57 crc kubenswrapper[4937]: I0225 15:45:57.506333 4937 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:57 crc kubenswrapper[4937]: E0225 15:45:57.722933 4937 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189787da94dbeeb6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.26364127 +0000 UTC m=+2.277033170,LastTimestamp:2026-02-25 15:45:51.26364127 +0000 UTC m=+2.277033170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:45:58 crc kubenswrapper[4937]: W0225 15:45:58.239067 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:58 crc kubenswrapper[4937]: E0225 15:45:58.239185 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.266341 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:58 crc kubenswrapper[4937]: W0225 15:45:58.294694 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:58 crc kubenswrapper[4937]: E0225 15:45:58.294781 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:58 crc kubenswrapper[4937]: W0225 15:45:58.482862 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:58 crc kubenswrapper[4937]: E0225 15:45:58.483034 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.510151 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd"} Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.510445 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d"} Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.510325 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.510591 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.510674 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f"} Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.510339 4937 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.510821 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.511687 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.511746 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.511762 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.511939 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.512019 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.512083 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.558204 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.558386 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.559846 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.559964 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.560070 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.846793 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:58 crc kubenswrapper[4937]: 
I0225 15:45:58.848155 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.848213 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.848227 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:58 crc kubenswrapper[4937]: I0225 15:45:58.848260 4937 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 15:45:58 crc kubenswrapper[4937]: E0225 15:45:58.848882 4937 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.169952 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.265362 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.518599 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"57df5d32aa12807288dc1d6c45adc6397b964b96322df1501cbd92fb45a36f4c"} Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.518666 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205"} Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.518724 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.519839 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.519894 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.519913 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.520617 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.522554 4937 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e9fed912c3e122fa040f93102e35c74e835205f89fe58ce3dd216808511810aa" exitCode=255 Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.522649 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e9fed912c3e122fa040f93102e35c74e835205f89fe58ce3dd216808511810aa"} Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.522722 4937 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.522757 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.523669 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.523721 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.523738 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.568386 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.568439 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.568457 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:45:59 crc kubenswrapper[4937]: I0225 15:45:59.569438 4937 scope.go:117] "RemoveContainer" containerID="e9fed912c3e122fa040f93102e35c74e835205f89fe58ce3dd216808511810aa" Feb 25 15:46:00 crc kubenswrapper[4937]: I0225 15:46:00.341579 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:46:00 crc kubenswrapper[4937]: I0225 15:46:00.528664 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 25 15:46:00 crc kubenswrapper[4937]: I0225 15:46:00.531739 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"02d139956165b67982f337eb183f7ae1609bdd22fd03f70789b3745f263e29c4"} Feb 25 15:46:00 crc kubenswrapper[4937]: I0225 15:46:00.531790 4937 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 15:46:00 crc kubenswrapper[4937]: I0225 15:46:00.531827 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:00 crc kubenswrapper[4937]: I0225 15:46:00.531835 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:00 crc kubenswrapper[4937]: I0225 15:46:00.533094 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:00 crc kubenswrapper[4937]: I0225 15:46:00.533133 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:00 crc kubenswrapper[4937]: I0225 15:46:00.533146 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:00 crc kubenswrapper[4937]: I0225 15:46:00.533669 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:00 crc kubenswrapper[4937]: I0225 15:46:00.533721 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:00 crc 
kubenswrapper[4937]: I0225 15:46:00.533740 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:00 crc kubenswrapper[4937]: I0225 15:46:00.617910 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:46:01 crc kubenswrapper[4937]: I0225 15:46:01.534232 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:01 crc kubenswrapper[4937]: I0225 15:46:01.535651 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:01 crc kubenswrapper[4937]: I0225 15:46:01.535693 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:01 crc kubenswrapper[4937]: I0225 15:46:01.535705 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:01 crc kubenswrapper[4937]: I0225 15:46:01.632739 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:46:01 crc kubenswrapper[4937]: I0225 15:46:01.632997 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:01 crc kubenswrapper[4937]: I0225 15:46:01.634597 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:01 crc kubenswrapper[4937]: I0225 15:46:01.634693 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:01 crc kubenswrapper[4937]: I0225 15:46:01.634721 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:01 crc kubenswrapper[4937]: I0225 15:46:01.899224 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:46:02 crc kubenswrapper[4937]: I0225 15:46:02.536283 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:02 crc kubenswrapper[4937]: I0225 15:46:02.537093 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:02 crc kubenswrapper[4937]: I0225 15:46:02.537130 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:02 crc kubenswrapper[4937]: I0225 15:46:02.537140 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:02 crc kubenswrapper[4937]: E0225 15:46:02.544048 4937 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 15:46:02 crc kubenswrapper[4937]: I0225 15:46:02.888723 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:46:02 crc kubenswrapper[4937]: I0225 15:46:02.888969 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:02 crc kubenswrapper[4937]: I0225 15:46:02.890118 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:02 crc kubenswrapper[4937]: I0225 
15:46:02.890157 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:02 crc kubenswrapper[4937]: I0225 15:46:02.890169 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:02 crc kubenswrapper[4937]: I0225 15:46:02.901629 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:46:03 crc kubenswrapper[4937]: I0225 15:46:03.540123 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:03 crc kubenswrapper[4937]: I0225 15:46:03.540282 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:03 crc kubenswrapper[4937]: I0225 15:46:03.541981 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:03 crc kubenswrapper[4937]: I0225 15:46:03.542015 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:03 crc kubenswrapper[4937]: I0225 15:46:03.542027 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:03 crc kubenswrapper[4937]: I0225 15:46:03.542117 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:03 crc kubenswrapper[4937]: I0225 15:46:03.542142 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:03 crc kubenswrapper[4937]: I0225 15:46:03.542154 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:04 crc kubenswrapper[4937]: I0225 15:46:04.168145 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 25 15:46:04 crc kubenswrapper[4937]: I0225 15:46:04.168454 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:04 crc kubenswrapper[4937]: I0225 15:46:04.170262 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:04 crc kubenswrapper[4937]: I0225 15:46:04.170326 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:04 crc kubenswrapper[4937]: I0225 15:46:04.170342 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:04 crc kubenswrapper[4937]: I0225 15:46:04.402134 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 25 15:46:04 crc kubenswrapper[4937]: I0225 15:46:04.543332 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:04 crc kubenswrapper[4937]: I0225 15:46:04.544723 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:04 crc kubenswrapper[4937]: I0225 15:46:04.544781 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:04 crc kubenswrapper[4937]: I0225 15:46:04.544801 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 25 15:46:05 crc kubenswrapper[4937]: I0225 15:46:05.123477 4937 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 25 15:46:05 crc kubenswrapper[4937]: I0225 15:46:05.249979 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:05 crc kubenswrapper[4937]: I0225 15:46:05.251929 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:05 crc kubenswrapper[4937]: I0225 15:46:05.251992 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:05 crc kubenswrapper[4937]: I0225 15:46:05.252002 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:05 crc kubenswrapper[4937]: I0225 15:46:05.252024 4937 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 15:46:06 crc kubenswrapper[4937]: I0225 15:46:06.197564 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:46:06 crc kubenswrapper[4937]: I0225 15:46:06.197831 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:06 crc kubenswrapper[4937]: I0225 15:46:06.200021 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:06 crc kubenswrapper[4937]: I0225 15:46:06.200083 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:06 crc kubenswrapper[4937]: I0225 15:46:06.200093 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:06 crc kubenswrapper[4937]: I0225 15:46:06.204146 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:46:06 crc kubenswrapper[4937]: I0225 15:46:06.553461 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:06 crc kubenswrapper[4937]: I0225 15:46:06.554972 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:06 crc kubenswrapper[4937]: I0225 15:46:06.555034 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:06 crc kubenswrapper[4937]: I0225 15:46:06.555054 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:09 crc kubenswrapper[4937]: I0225 15:46:09.197750 4937 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 15:46:09 crc kubenswrapper[4937]: I0225 15:46:09.197893 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 15:46:10 crc kubenswrapper[4937]: W0225 15:46:10.051314 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 25 15:46:10 crc kubenswrapper[4937]: I0225 15:46:10.051555 4937 trace.go:236] Trace[837714242]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Feb-2026 15:46:00.049) (total time: 10001ms): Feb 25 15:46:10 crc kubenswrapper[4937]: Trace[837714242]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:46:10.051) Feb 25 15:46:10 crc kubenswrapper[4937]: Trace[837714242]: [10.001453605s] [10.001453605s] END Feb 25 15:46:10 crc kubenswrapper[4937]: E0225 15:46:10.051596 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 25 15:46:10 crc kubenswrapper[4937]: I0225 15:46:10.202765 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:10Z is after 2026-02-23T05:33:13Z Feb 25 15:46:10 crc kubenswrapper[4937]: W0225 15:46:10.209087 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:10Z is after 2026-02-23T05:33:13Z Feb 25 15:46:10 crc kubenswrapper[4937]: E0225 15:46:10.209173 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 15:46:10 crc kubenswrapper[4937]: E0225 15:46:10.212743 4937 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 15:46:10 crc kubenswrapper[4937]: E0225 15:46:10.212988 4937 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:10Z is 
after 2026-02-23T05:33:13Z" node="crc" Feb 25 15:46:10 crc kubenswrapper[4937]: E0225 15:46:10.225200 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:10Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 25 15:46:10 crc kubenswrapper[4937]: W0225 15:46:10.228891 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:10Z is after 2026-02-23T05:33:13Z Feb 25 15:46:10 crc kubenswrapper[4937]: E0225 15:46:10.228969 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 15:46:10 crc kubenswrapper[4937]: E0225 15:46:10.232440 4937 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:10Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189787da94dbeeb6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.26364127 +0000 UTC m=+2.277033170,LastTimestamp:2026-02-25 15:45:51.26364127 +0000 UTC m=+2.277033170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:10 crc kubenswrapper[4937]: W0225 15:46:10.235977 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:10Z is after 2026-02-23T05:33:13Z Feb 25 15:46:10 crc kubenswrapper[4937]: E0225 15:46:10.236055 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 15:46:10 crc kubenswrapper[4937]: I0225 15:46:10.243181 4937 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 25 15:46:10 crc kubenswrapper[4937]: I0225 15:46:10.243246 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 25 15:46:10 crc kubenswrapper[4937]: I0225 15:46:10.254839 4937 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 25 15:46:10 crc kubenswrapper[4937]: I0225 15:46:10.254965 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 25 15:46:10 crc kubenswrapper[4937]: I0225 15:46:10.269331 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:10Z is after 2026-02-23T05:33:13Z Feb 25 15:46:10 crc kubenswrapper[4937]: I0225 15:46:10.641732 4937 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50634->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 25 15:46:10 crc kubenswrapper[4937]: I0225 15:46:10.641804 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50634->192.168.126.11:17697: read: connection reset by peer" Feb 25 15:46:10 crc kubenswrapper[4937]: I0225 15:46:10.641871 4937 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50638->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 25 15:46:10 crc kubenswrapper[4937]: I0225 15:46:10.641960 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50638->192.168.126.11:17697: read: connection reset by peer" Feb 25 15:46:11 crc kubenswrapper[4937]: I0225 15:46:11.267840 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:11Z is after 2026-02-23T05:33:13Z Feb 25 15:46:11 crc kubenswrapper[4937]: I0225 15:46:11.568222 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 25 15:46:11 crc kubenswrapper[4937]: I0225 15:46:11.568826 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 25 15:46:11 crc kubenswrapper[4937]: I0225 15:46:11.570567 4937 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="02d139956165b67982f337eb183f7ae1609bdd22fd03f70789b3745f263e29c4" exitCode=255 Feb 25 15:46:11 crc kubenswrapper[4937]: I0225 15:46:11.570602 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"02d139956165b67982f337eb183f7ae1609bdd22fd03f70789b3745f263e29c4"} Feb 25 15:46:11 crc kubenswrapper[4937]: I0225 15:46:11.570653 4937 scope.go:117] "RemoveContainer" containerID="e9fed912c3e122fa040f93102e35c74e835205f89fe58ce3dd216808511810aa" Feb 25 15:46:11 crc kubenswrapper[4937]: I0225 15:46:11.570910 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:11 crc kubenswrapper[4937]: I0225 15:46:11.572187 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:11 crc kubenswrapper[4937]: I0225 15:46:11.572210 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:11 crc kubenswrapper[4937]: I0225 15:46:11.572219 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:11 crc kubenswrapper[4937]: I0225 15:46:11.572711 4937 scope.go:117] "RemoveContainer" containerID="02d139956165b67982f337eb183f7ae1609bdd22fd03f70789b3745f263e29c4" Feb 25 15:46:11 crc kubenswrapper[4937]: E0225 15:46:11.572943 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:46:11 crc kubenswrapper[4937]: I0225 15:46:11.907710 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:46:12 crc kubenswrapper[4937]: I0225 15:46:12.268097 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:12Z is after 2026-02-23T05:33:13Z Feb 25 15:46:12 crc kubenswrapper[4937]: E0225 15:46:12.544195 4937 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 15:46:12 crc kubenswrapper[4937]: I0225 15:46:12.574543 4937 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 25 15:46:12 crc kubenswrapper[4937]: I0225 15:46:12.577215 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:12 crc kubenswrapper[4937]: I0225 15:46:12.579393 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:12 crc kubenswrapper[4937]: I0225 15:46:12.579451 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:12 crc kubenswrapper[4937]: I0225 15:46:12.579467 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:12 crc kubenswrapper[4937]: I0225 15:46:12.580277 4937 scope.go:117] "RemoveContainer" containerID="02d139956165b67982f337eb183f7ae1609bdd22fd03f70789b3745f263e29c4" Feb 25 15:46:12 crc kubenswrapper[4937]: E0225 15:46:12.580552 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:46:12 crc kubenswrapper[4937]: I0225 15:46:12.581594 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:46:13 crc kubenswrapper[4937]: I0225 15:46:13.269455 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:13Z is after 2026-02-23T05:33:13Z Feb 25 15:46:13 crc kubenswrapper[4937]: I0225 15:46:13.579803 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:13 crc kubenswrapper[4937]: I0225 15:46:13.581394 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:13 crc kubenswrapper[4937]: I0225 15:46:13.581530 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:13 crc kubenswrapper[4937]: I0225 15:46:13.581552 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:13 crc kubenswrapper[4937]: I0225 15:46:13.582613 4937 scope.go:117] "RemoveContainer" containerID="02d139956165b67982f337eb183f7ae1609bdd22fd03f70789b3745f263e29c4" Feb 25 15:46:13 crc kubenswrapper[4937]: E0225 15:46:13.582931 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:46:14 crc kubenswrapper[4937]: I0225 15:46:14.205128 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-etcd/etcd-crc" Feb 25 15:46:14 crc kubenswrapper[4937]: I0225 15:46:14.205350 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:14 crc kubenswrapper[4937]: I0225 15:46:14.206799 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:14 crc kubenswrapper[4937]: I0225 15:46:14.206871 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:14 crc kubenswrapper[4937]: I0225 15:46:14.206890 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:14 crc kubenswrapper[4937]: I0225 15:46:14.220118 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 25 15:46:14 crc kubenswrapper[4937]: I0225 15:46:14.268897 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:14Z is after 2026-02-23T05:33:13Z Feb 25 15:46:14 crc kubenswrapper[4937]: I0225 15:46:14.582277 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:14 crc kubenswrapper[4937]: I0225 15:46:14.583041 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:14 crc kubenswrapper[4937]: I0225 15:46:14.583076 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:14 crc kubenswrapper[4937]: I0225 15:46:14.583085 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:15 crc kubenswrapper[4937]: I0225 15:46:15.274820 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:15Z is after 2026-02-23T05:33:13Z Feb 25 15:46:16 crc kubenswrapper[4937]: I0225 15:46:16.270009 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:16Z is after 2026-02-23T05:33:13Z Feb 25 15:46:17 crc kubenswrapper[4937]: I0225 15:46:17.214038 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:17 crc kubenswrapper[4937]: I0225 15:46:17.215917 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:17 crc kubenswrapper[4937]: I0225 15:46:17.215992 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:17 crc kubenswrapper[4937]: I0225 15:46:17.216012 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:17 crc kubenswrapper[4937]: I0225 15:46:17.216056 4937 kubelet_node_status.go:76] "Attempting to register 
node" node="crc" Feb 25 15:46:17 crc kubenswrapper[4937]: E0225 15:46:17.221860 4937 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:17Z is after 2026-02-23T05:33:13Z" node="crc" Feb 25 15:46:17 crc kubenswrapper[4937]: E0225 15:46:17.231219 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:17Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 25 15:46:17 crc kubenswrapper[4937]: I0225 15:46:17.269608 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:17Z is after 2026-02-23T05:33:13Z Feb 25 15:46:18 crc kubenswrapper[4937]: I0225 15:46:18.269601 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:18Z is after 2026-02-23T05:33:13Z Feb 25 15:46:19 crc kubenswrapper[4937]: I0225 15:46:19.198084 4937 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 15:46:19 crc kubenswrapper[4937]: I0225 15:46:19.198215 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 15:46:19 crc kubenswrapper[4937]: I0225 15:46:19.271031 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:19Z is after 2026-02-23T05:33:13Z Feb 25 15:46:20 crc kubenswrapper[4937]: E0225 15:46:20.239829 4937 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:20Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189787da94dbeeb6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.26364127 +0000 UTC m=+2.277033170,LastTimestamp:2026-02-25 15:45:51.26364127 +0000 UTC m=+2.277033170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:20 crc kubenswrapper[4937]: I0225 15:46:20.270737 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:20Z is after 2026-02-23T05:33:13Z Feb 25 15:46:20 crc kubenswrapper[4937]: I0225 15:46:20.342616 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:46:20 crc kubenswrapper[4937]: I0225 15:46:20.342965 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:20 crc kubenswrapper[4937]: I0225 15:46:20.345083 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:20 crc kubenswrapper[4937]: I0225 15:46:20.345139 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:20 crc kubenswrapper[4937]: I0225 15:46:20.345157 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:20 crc kubenswrapper[4937]: I0225 15:46:20.346029 4937 scope.go:117] "RemoveContainer" containerID="02d139956165b67982f337eb183f7ae1609bdd22fd03f70789b3745f263e29c4" Feb 25 15:46:20 crc kubenswrapper[4937]: E0225 15:46:20.346352 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:46:21 crc kubenswrapper[4937]: I0225 15:46:21.271701 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:21Z is after 2026-02-23T05:33:13Z Feb 25 15:46:21 crc kubenswrapper[4937]: W0225 15:46:21.751865 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:21Z is after 2026-02-23T05:33:13Z Feb 25 15:46:21 crc kubenswrapper[4937]: E0225 15:46:21.751999 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:21Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Feb 25 15:46:22 crc kubenswrapper[4937]: I0225 15:46:22.269409 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:22Z is after 2026-02-23T05:33:13Z Feb 25 15:46:22 crc kubenswrapper[4937]: E0225 15:46:22.545257 4937 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 15:46:23 crc kubenswrapper[4937]: I0225 15:46:23.271440 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:23Z is after 2026-02-23T05:33:13Z Feb 25 15:46:24 crc kubenswrapper[4937]: I0225 15:46:24.222378 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:24 crc kubenswrapper[4937]: I0225 15:46:24.224343 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:24 crc kubenswrapper[4937]: I0225 15:46:24.224404 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:24 crc kubenswrapper[4937]: I0225 15:46:24.224424 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:24 crc kubenswrapper[4937]: I0225 15:46:24.224461 4937 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 15:46:24 crc kubenswrapper[4937]: E0225 15:46:24.229441 4937 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:24Z is after 2026-02-23T05:33:13Z" node="crc" Feb 25 15:46:24 crc kubenswrapper[4937]: E0225 15:46:24.236742 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:24Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 25 15:46:24 crc kubenswrapper[4937]: I0225 15:46:24.267997 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:24Z is after 2026-02-23T05:33:13Z Feb 25 15:46:25 crc kubenswrapper[4937]: I0225 15:46:25.268806 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:46:25Z is after 2026-02-23T05:33:13Z Feb 25 15:46:26 crc kubenswrapper[4937]: I0225 15:46:26.272283 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:26 crc kubenswrapper[4937]: I0225 15:46:26.351328 4937 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 25 15:46:26 crc kubenswrapper[4937]: I0225 15:46:26.375300 4937 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 25 15:46:26 crc kubenswrapper[4937]: W0225 15:46:26.554962 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 25 15:46:26 crc kubenswrapper[4937]: E0225 15:46:26.555337 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 25 15:46:27 crc kubenswrapper[4937]: W0225 15:46:27.044699 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 25 15:46:27 crc kubenswrapper[4937]: E0225 15:46:27.044793 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 25 15:46:27 crc kubenswrapper[4937]: I0225 15:46:27.272524 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:27 crc kubenswrapper[4937]: I0225 15:46:27.761269 4937 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:54770->192.168.126.11:10357: read: connection reset by peer" start-of-body= Feb 25 15:46:27 crc kubenswrapper[4937]: I0225 15:46:27.761944 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:54770->192.168.126.11:10357: read: connection reset by peer" Feb 25 15:46:27 crc kubenswrapper[4937]: I0225 15:46:27.762028 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:46:27 crc kubenswrapper[4937]: I0225 15:46:27.762233 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:27 crc kubenswrapper[4937]: I0225 15:46:27.763630 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:27 crc kubenswrapper[4937]: I0225 15:46:27.763706 4937 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:27 crc kubenswrapper[4937]: I0225 15:46:27.763720 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:27 crc kubenswrapper[4937]: I0225 15:46:27.764662 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 25 15:46:27 crc kubenswrapper[4937]: I0225 15:46:27.764942 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75" gracePeriod=30 Feb 25 15:46:28 crc kubenswrapper[4937]: I0225 15:46:28.272660 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:28 crc kubenswrapper[4937]: I0225 15:46:28.626734 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 25 15:46:28 crc kubenswrapper[4937]: I0225 15:46:28.627478 4937 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75" exitCode=255 Feb 25 15:46:28 crc kubenswrapper[4937]: I0225 15:46:28.627534 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75"} Feb 25 15:46:28 crc kubenswrapper[4937]: I0225 15:46:28.627558 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65"} Feb 25 15:46:28 crc kubenswrapper[4937]: I0225 15:46:28.627687 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:28 crc kubenswrapper[4937]: I0225 15:46:28.628713 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:28 crc kubenswrapper[4937]: I0225 15:46:28.628747 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:28 crc kubenswrapper[4937]: I0225 15:46:28.628757 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:29 crc kubenswrapper[4937]: I0225 15:46:29.273392 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:30 crc 
kubenswrapper[4937]: E0225 15:46:30.245215 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da94dbeeb6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.26364127 +0000 UTC m=+2.277033170,LastTimestamp:2026-02-25 15:45:51.26364127 +0000 UTC m=+2.277033170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.251816 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da9982108c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341637772 +0000 UTC m=+2.355029662,LastTimestamp:2026-02-25 15:45:51.341637772 +0000 UTC m=+2.355029662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.259534 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da998259fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341656573 +0000 UTC m=+2.355048463,LastTimestamp:2026-02-25 15:45:51.341656573 +0000 UTC m=+2.355048463,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: I0225 15:46:30.266312 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.266432 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da99828653 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 
15:45:51.341667923 +0000 UTC m=+2.355059813,LastTimestamp:2026-02-25 15:45:51.341667923 +0000 UTC m=+2.355059813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.271261 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787dae0832af7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:52.532892407 +0000 UTC m=+3.546284297,LastTimestamp:2026-02-25 15:45:52.532892407 +0000 UTC m=+3.546284297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.278260 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da9982108c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da9982108c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341637772 +0000 UTC m=+2.355029662,LastTimestamp:2026-02-25 15:45:52.631198566 +0000 UTC m=+3.644590446,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.285426 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da998259fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da998259fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341656573 +0000 UTC m=+2.355048463,LastTimestamp:2026-02-25 15:45:52.631215127 +0000 UTC m=+3.644607017,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.292102 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da99828653\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da99828653 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341667923 +0000 UTC m=+2.355059813,LastTimestamp:2026-02-25 15:45:52.631224127 +0000 UTC m=+3.644616017,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.299306 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da9982108c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da9982108c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341637772 +0000 UTC m=+2.355029662,LastTimestamp:2026-02-25 15:45:52.834339545 +0000 UTC m=+3.847731435,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.306183 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da998259fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da998259fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341656573 +0000 UTC m=+2.355048463,LastTimestamp:2026-02-25 15:45:52.834362665 +0000 UTC m=+3.847754555,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.311073 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da99828653\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da99828653 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341667923 +0000 UTC m=+2.355059813,LastTimestamp:2026-02-25 15:45:52.834374456 +0000 UTC m=+3.847766346,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.316986 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da9982108c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da9982108c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341637772 +0000 UTC m=+2.355029662,LastTimestamp:2026-02-25 15:45:52.872240917 +0000 UTC m=+3.885632847,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.324602 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da998259fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da998259fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341656573 +0000 UTC m=+2.355048463,LastTimestamp:2026-02-25 15:45:52.872278948 +0000 UTC m=+3.885670868,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.331873 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da99828653\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da99828653 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341667923 +0000 UTC m=+2.355059813,LastTimestamp:2026-02-25 15:45:52.872300529 +0000 UTC m=+3.885692459,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.339339 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da9982108c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da9982108c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341637772 +0000 UTC m=+2.355029662,LastTimestamp:2026-02-25 15:45:52.874116056 +0000 UTC m=+3.887507986,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.346132 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da998259fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189787da998259fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341656573 +0000 UTC m=+2.355048463,LastTimestamp:2026-02-25 15:45:52.874135017 +0000 UTC m=+3.887526947,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.354095 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da99828653\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da99828653 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341667923 +0000 UTC m=+2.355059813,LastTimestamp:2026-02-25 15:45:52.874152067 +0000 UTC m=+3.887543987,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.361105 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da9982108c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da9982108c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341637772 +0000 UTC m=+2.355029662,LastTimestamp:2026-02-25 15:45:52.874191058 +0000 UTC m=+3.887582988,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.368111 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da998259fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da998259fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341656573 +0000 UTC m=+2.355048463,LastTimestamp:2026-02-25 15:45:52.874222209 +0000 UTC m=+3.887614139,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.375605 4937 event.go:359] "Server rejected event (will not retry!)" err="events 
\"crc.189787da99828653\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da99828653 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341667923 +0000 UTC m=+2.355059813,LastTimestamp:2026-02-25 15:45:52.874246319 +0000 UTC m=+3.887638249,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.381947 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da9982108c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da9982108c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341637772 +0000 UTC m=+2.355029662,LastTimestamp:2026-02-25 15:45:52.876393445 +0000 UTC m=+3.889785365,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.388468 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da998259fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da998259fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341656573 +0000 UTC m=+2.355048463,LastTimestamp:2026-02-25 15:45:52.876447717 +0000 UTC m=+3.889839647,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.395694 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da99828653\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da99828653 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341667923 +0000 UTC m=+2.355059813,LastTimestamp:2026-02-25 15:45:52.876553579 +0000 UTC m=+3.889945509,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc 
kubenswrapper[4937]: E0225 15:46:30.402819 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da9982108c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da9982108c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341637772 +0000 UTC m=+2.355029662,LastTimestamp:2026-02-25 15:45:52.876598661 +0000 UTC m=+3.889990571,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.408288 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189787da998259fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189787da998259fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:51.341656573 +0000 UTC m=+2.355048463,LastTimestamp:2026-02-25 15:45:52.876621011 +0000 UTC m=+3.890013141,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.415019 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787db10e118a8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:53.344354472 +0000 UTC m=+4.357746402,LastTimestamp:2026-02-25 15:45:53.344354472 +0000 UTC m=+4.357746402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.421480 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189787db10e2d1cd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:53.344467405 +0000 UTC m=+4.357859335,LastTimestamp:2026-02-25 15:45:53.344467405 +0000 UTC m=+4.357859335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.427053 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189787db10f54a99 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:53.345677977 +0000 UTC m=+4.359069907,LastTimestamp:2026-02-25 15:45:53.345677977 +0000 UTC m=+4.359069907,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.432182 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787db11380b0c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:53.35005262 +0000 UTC m=+4.363444510,LastTimestamp:2026-02-25 15:45:53.35005262 +0000 UTC m=+4.363444510,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.437601 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787db119c69e5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:53.356630501 +0000 UTC m=+4.370022421,LastTimestamp:2026-02-25 15:45:53.356630501 +0000 UTC m=+4.370022421,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.444757 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787db5fc969e2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:54.668202466 +0000 UTC m=+5.681594396,LastTimestamp:2026-02-25 15:45:54.668202466 +0000 UTC m=+5.681594396,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.450220 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787db5fc9d1fc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:54.668229116 +0000 UTC m=+5.681621046,LastTimestamp:2026-02-25 15:45:54.668229116 +0000 UTC m=+5.681621046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.457152 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787db5fdb8d3f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:54.669391167 +0000 UTC 
m=+5.682783097,LastTimestamp:2026-02-25 15:45:54.669391167 +0000 UTC m=+5.682783097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.460653 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189787db5fe17619 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:54.669778457 +0000 UTC m=+5.683170387,LastTimestamp:2026-02-25 15:45:54.669778457 +0000 UTC m=+5.683170387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.465316 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189787db5fe43399 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:54.669958041 +0000 UTC m=+5.683349971,LastTimestamp:2026-02-25 15:45:54.669958041 +0000 UTC m=+5.683349971,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.472526 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189787db64b808ec openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:54.750949612 +0000 UTC m=+5.764341512,LastTimestamp:2026-02-25 15:45:54.750949612 +0000 UTC m=+5.764341512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.476741 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787db678519ab openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:54.797943211 +0000 UTC m=+5.811335131,LastTimestamp:2026-02-25 15:45:54.797943211 +0000 UTC m=+5.811335131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.482406 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787db67a0fc54 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:54.799770708 +0000 UTC m=+5.813162598,LastTimestamp:2026-02-25 15:45:54.799770708 +0000 UTC m=+5.813162598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.488555 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787db6a1187bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:54.84070086 +0000 UTC m=+5.854092780,LastTimestamp:2026-02-25 15:45:54.84070086 +0000 UTC m=+5.854092780,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.493044 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787db6b4cca48 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:54.861361736 +0000 UTC m=+5.874753666,LastTimestamp:2026-02-25 15:45:54.861361736 +0000 UTC m=+5.874753666,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.497064 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189787db6b7bb5c3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:54.864436675 +0000 UTC m=+5.877828595,LastTimestamp:2026-02-25 15:45:54.864436675 +0000 UTC m=+5.877828595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.500740 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787db81d21ab3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.239197363 +0000 UTC m=+6.252589293,LastTimestamp:2026-02-25 15:45:55.239197363 +0000 UTC m=+6.252589293,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.507514 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787db842e517b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.278795131 +0000 UTC 
m=+6.292187061,LastTimestamp:2026-02-25 15:45:55.278795131 +0000 UTC m=+6.292187061,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.511987 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787db844d5934 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.280828724 +0000 UTC m=+6.294220614,LastTimestamp:2026-02-25 15:45:55.280828724 +0000 UTC m=+6.294220614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.520105 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787db8ad9ca2f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.390695983 +0000 UTC m=+6.404087903,LastTimestamp:2026-02-25 15:45:55.390695983 +0000 UTC m=+6.404087903,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.524069 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189787db8ae4c456 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.391415382 +0000 UTC 
m=+6.404807272,LastTimestamp:2026-02-25 15:45:55.391415382 +0000 UTC m=+6.404807272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.530174 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189787db8ba0c551 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.403736401 +0000 UTC m=+6.417128291,LastTimestamp:2026-02-25 15:45:55.403736401 +0000 UTC m=+6.417128291,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.537291 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787db8c000bad openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.409980333 +0000 UTC m=+6.423372213,LastTimestamp:2026-02-25 15:45:55.409980333 +0000 UTC m=+6.423372213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.543379 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787db968b4171 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.586875761 +0000 UTC m=+6.600267691,LastTimestamp:2026-02-25 15:45:55.586875761 +0000 UTC m=+6.600267691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.548549 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787db9c733305 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.685962501 +0000 UTC m=+6.699354401,LastTimestamp:2026-02-25 15:45:55.685962501 +0000 UTC m=+6.699354401,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.553357 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787db9c85f479 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.687191673 +0000 UTC m=+6.700583573,LastTimestamp:2026-02-25 15:45:55.687191673 +0000 UTC m=+6.700583573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.566944 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787db9c9b3849 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.688585289 +0000 UTC m=+6.701977189,LastTimestamp:2026-02-25 15:45:55.688585289 +0000 UTC m=+6.701977189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.573914 4937 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189787db9ca3bf3b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.689144123 +0000 UTC m=+6.702536033,LastTimestamp:2026-02-25 15:45:55.689144123 +0000 UTC m=+6.702536033,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.579990 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787db9ca575d6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.689256406 +0000 UTC m=+6.702648306,LastTimestamp:2026-02-25 15:45:55.689256406 +0000 UTC m=+6.702648306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.587024 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189787db9cb15cc0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.690036416 +0000 UTC m=+6.703428336,LastTimestamp:2026-02-25 15:45:55.690036416 +0000 UTC m=+6.703428336,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.591004 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dba18960a6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.771302054 +0000 UTC m=+6.784693954,LastTimestamp:2026-02-25 15:45:55.771302054 +0000 UTC m=+6.784693954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.601541 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dba1e64490 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.777389712 +0000 UTC m=+6.790781602,LastTimestamp:2026-02-25 15:45:55.777389712 +0000 UTC m=+6.790781602,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.606403 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189787dba1e6a035 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.777413173 +0000 UTC m=+6.790805093,LastTimestamp:2026-02-25 15:45:55.777413173 +0000 UTC m=+6.790805093,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.611361 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189787dba1e91c97 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.777576087 +0000 UTC m=+6.790967987,LastTimestamp:2026-02-25 15:45:55.777576087 +0000 UTC 
m=+6.790967987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.616290 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dba1fbf53f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.778811199 +0000 UTC m=+6.792203089,LastTimestamp:2026-02-25 15:45:55.778811199 +0000 UTC m=+6.792203089,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.621004 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189787dba202a480 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.77924928 +0000 UTC m=+6.792641170,LastTimestamp:2026-02-25 15:45:55.77924928 +0000 UTC m=+6.792641170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.625681 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787dbb4f0bcb9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.096842937 +0000 UTC m=+7.110234827,LastTimestamp:2026-02-25 15:45:56.096842937 +0000 UTC m=+7.110234827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.630230 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbb60346b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.114835124 +0000 UTC m=+7.128227014,LastTimestamp:2026-02-25 15:45:56.114835124 +0000 UTC m=+7.128227014,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.637431 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189787dbb658a195 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.120428949 +0000 UTC m=+7.133820839,LastTimestamp:2026-02-25 15:45:56.120428949 +0000 UTC m=+7.133820839,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.642333 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787dbb6ac6eb7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.125920951 +0000 UTC m=+7.139312871,LastTimestamp:2026-02-25 15:45:56.125920951 +0000 UTC m=+7.139312871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.646377 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbb741d90b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.135713035 +0000 UTC m=+7.149104965,LastTimestamp:2026-02-25 15:45:56.135713035 +0000 UTC m=+7.149104965,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.651041 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbb75a90a1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.137332897 +0000 UTC m=+7.150724827,LastTimestamp:2026-02-25 15:45:56.137332897 +0000 UTC m=+7.150724827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.657205 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189787dbb8352169 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.151656809 +0000 UTC m=+7.165048729,LastTimestamp:2026-02-25 15:45:56.151656809 +0000 UTC m=+7.165048729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.663041 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189787dbb85fbabb openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.154448571 +0000 UTC m=+7.167840471,LastTimestamp:2026-02-25 15:45:56.154448571 +0000 UTC m=+7.167840471,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.672117 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dbc8935c71 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.426267761 +0000 UTC m=+7.439659691,LastTimestamp:2026-02-25 15:45:56.426267761 +0000 UTC m=+7.439659691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.676726 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189787dbcce1fa75 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.498528885 +0000 UTC m=+7.511920775,LastTimestamp:2026-02-25 15:45:56.498528885 +0000 UTC m=+7.511920775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.683826 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbcce91af5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.498995957 +0000 UTC m=+7.512387857,LastTimestamp:2026-02-25 15:45:56.498995957 +0000 UTC m=+7.512387857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.687681 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbd004fbc6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.55115463 +0000 UTC m=+7.564546530,LastTimestamp:2026-02-25 15:45:56.55115463 +0000 UTC m=+7.564546530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.694294 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbd01aa170 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.552573296 +0000 UTC m=+7.565965186,LastTimestamp:2026-02-25 15:45:56.552573296 +0000 UTC m=+7.565965186,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.698768 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189787dbd077487e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.558645374 +0000 UTC m=+7.572037264,LastTimestamp:2026-02-25 15:45:56.558645374 +0000 UTC m=+7.572037264,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.703330 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dbe0181f39 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.820844345 +0000 UTC m=+7.834236235,LastTimestamp:2026-02-25 15:45:56.820844345 +0000 UTC m=+7.834236235,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.708017 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbe238e8ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.8565475 +0000 UTC m=+7.869939430,LastTimestamp:2026-02-25 15:45:56.8565475 +0000 UTC m=+7.869939430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.712299 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dbe44a4d84 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.89124186 +0000 UTC m=+7.904633750,LastTimestamp:2026-02-25 15:45:56.89124186 +0000 UTC m=+7.904633750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.716412 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbe62ade8b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.922736267 +0000 UTC m=+7.936128207,LastTimestamp:2026-02-25 15:45:56.922736267 +0000 UTC m=+7.936128207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.720717 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbe6465744 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.924536644 +0000 UTC m=+7.937928544,LastTimestamp:2026-02-25 15:45:56.924536644 +0000 UTC m=+7.937928544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.724942 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbf9a186c6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:57.249279686 +0000 UTC m=+8.262671616,LastTimestamp:2026-02-25 15:45:57.249279686 +0000 UTC m=+8.262671616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.729052 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbfde3c201 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:57.320729089 +0000 UTC m=+8.334120979,LastTimestamp:2026-02-25 15:45:57.320729089 +0000 UTC m=+8.334120979,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.734111 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc087cfcbb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:57.498543291 +0000 UTC m=+8.511935211,LastTimestamp:2026-02-25 15:45:57.498543291 +0000 UTC m=+8.511935211,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.741544 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc1a12b927 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:57.793569063 +0000 UTC m=+8.806960953,LastTimestamp:2026-02-25 15:45:57.793569063 +0000 UTC m=+8.806960953,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.747594 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc1c0ded2f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container 
etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:57.826809135 +0000 UTC m=+8.840201035,LastTimestamp:2026-02-25 15:45:57.826809135 +0000 UTC m=+8.840201035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.752522 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc1c20a376 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:57.828035446 +0000 UTC m=+8.841427336,LastTimestamp:2026-02-25 15:45:57.828035446 +0000 UTC m=+8.841427336,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.759861 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc2f0e54b6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:58.145602742 +0000 UTC m=+9.158994632,LastTimestamp:2026-02-25 15:45:58.145602742 +0000 UTC m=+9.158994632,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.766846 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc3159a5b6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:58.18409311 +0000 UTC m=+9.197485000,LastTimestamp:2026-02-25 15:45:58.18409311 +0000 UTC m=+9.197485000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.771651 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc316b2070 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:58.18523864 +0000 UTC m=+9.198630520,LastTimestamp:2026-02-25 15:45:58.18523864 +0000 UTC m=+9.198630520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.776912 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc407fa542 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:58.438241602 +0000 UTC m=+9.451633492,LastTimestamp:2026-02-25 15:45:58.438241602 +0000 UTC m=+9.451633492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.778442 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc42229906 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:58.465698054 +0000 UTC m=+9.479089944,LastTimestamp:2026-02-25 15:45:58.465698054 +0000 UTC m=+9.479089944,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.782924 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc42487e68 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:58.468181608 +0000 UTC m=+9.481573498,LastTimestamp:2026-02-25 15:45:58.468181608 +0000 UTC m=+9.481573498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.790533 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc65718d2e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:59.058074926 +0000 UTC m=+10.071466816,LastTimestamp:2026-02-25 15:45:59.058074926 +0000 UTC m=+10.071466816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.795426 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc670e35ed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:59.085118957 +0000 UTC m=+10.098510887,LastTimestamp:2026-02-25 15:45:59.085118957 +0000 UTC m=+10.098510887,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.801902 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc672a1ddd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:59.086947805 +0000 UTC m=+10.100339695,LastTimestamp:2026-02-25 15:45:59.086947805 +0000 UTC m=+10.100339695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.808458 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc77e458cb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:59.367588043 +0000 UTC m=+10.380979933,LastTimestamp:2026-02-25 15:45:59.367588043 +0000 UTC m=+10.380979933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.815203 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189787dc790da342 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:59.387071298 +0000 UTC m=+10.400463188,LastTimestamp:2026-02-25 15:45:59.387071298 +0000 UTC m=+10.400463188,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.822536 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189787dbe6465744\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbe6465744 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:56.924536644 +0000 UTC m=+7.937928544,LastTimestamp:2026-02-25 15:45:59.571341327 +0000 UTC m=+10.584733217,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.827052 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189787dbf9a186c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbf9a186c6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:57.249279686 +0000 UTC m=+8.262671616,LastTimestamp:2026-02-25 15:45:59.783279214 +0000 UTC m=+10.796671104,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.832259 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189787dbfde3c201\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787dbfde3c201 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:57.320729089 +0000 UTC m=+8.334120979,LastTimestamp:2026-02-25 15:45:59.835748655 +0000 UTC m=+10.849140545,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.840016 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 25 15:46:30 crc kubenswrapper[4937]: &Event{ObjectMeta:{kube-controller-manager-crc.189787dec1d25f59 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 25 15:46:30 crc kubenswrapper[4937]: body: Feb 25 15:46:30 crc kubenswrapper[4937]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:46:09.197858649 +0000 UTC m=+20.211250629,LastTimestamp:2026-02-25 15:46:09.197858649 +0000 UTC m=+20.211250629,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 15:46:30 crc kubenswrapper[4937]: > Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.844183 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787dec1d40e2e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:46:09.197968942 +0000 UTC m=+20.211360902,LastTimestamp:2026-02-25 15:46:09.197968942 +0000 UTC m=+20.211360902,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.848065 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 25 15:46:30 crc kubenswrapper[4937]: &Event{ObjectMeta:{kube-apiserver-crc.189787df00216dfa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 25 15:46:30 crc kubenswrapper[4937]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 25 15:46:30 crc kubenswrapper[4937]: Feb 25 15:46:30 crc kubenswrapper[4937]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:46:10.24322713 +0000 UTC m=+21.256619020,LastTimestamp:2026-02-25 15:46:10.24322713 +0000 UTC m=+21.256619020,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 15:46:30 crc kubenswrapper[4937]: > Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.851916 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787df002209e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:46:10.243267041 +0000 UTC m=+21.256658931,LastTimestamp:2026-02-25 15:46:10.243267041 +0000 UTC m=+21.256658931,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.856806 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189787df00216dfa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-apiserver\"" event=< Feb 25 15:46:30 crc kubenswrapper[4937]: &Event{ObjectMeta:{kube-apiserver-crc.189787df00216dfa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 25 15:46:30 crc kubenswrapper[4937]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 25 15:46:30 crc kubenswrapper[4937]: Feb 25 15:46:30 crc kubenswrapper[4937]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:46:10.24322713 +0000 UTC m=+21.256619020,LastTimestamp:2026-02-25 15:46:10.254934624 +0000 UTC m=+21.268326534,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 15:46:30 crc kubenswrapper[4937]: > Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.861553 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189787df002209e1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189787df002209e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:46:10.243267041 +0000 UTC m=+21.256658931,LastTimestamp:2026-02-25 15:46:10.254995925 +0000 UTC m=+21.268387825,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.867771 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189787dec1d25f59\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 25 15:46:30 crc kubenswrapper[4937]: &Event{ObjectMeta:{kube-controller-manager-crc.189787dec1d25f59 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 25 15:46:30 crc kubenswrapper[4937]: body: Feb 25 15:46:30 crc kubenswrapper[4937]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:46:09.197858649 +0000 UTC m=+20.211250629,LastTimestamp:2026-02-25 15:46:19.198185254 +0000 UTC 
m=+30.211577184,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 15:46:30 crc kubenswrapper[4937]: > Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.871475 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189787dec1d40e2e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787dec1d40e2e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:46:09.197968942 +0000 UTC m=+20.211360902,LastTimestamp:2026-02-25 15:46:19.198259855 +0000 UTC m=+30.211651785,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.875805 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 25 15:46:30 crc kubenswrapper[4937]: &Event{ObjectMeta:{kube-controller-manager-crc.189787e314537907 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:54770->192.168.126.11:10357: read: connection reset by peer Feb 25 15:46:30 crc kubenswrapper[4937]: body: Feb 25 15:46:30 crc kubenswrapper[4937]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:46:27.761920263 +0000 UTC m=+38.775312153,LastTimestamp:2026-02-25 15:46:27.761920263 +0000 UTC m=+38.775312153,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 15:46:30 crc kubenswrapper[4937]: > Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.879936 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787e314549b6d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe 
failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:54770->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:46:27.761994605 +0000 UTC m=+38.775386505,LastTimestamp:2026-02-25 15:46:27.761994605 +0000 UTC m=+38.775386505,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.884849 4937 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787e3148035af openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:46:27.764852143 +0000 UTC m=+38.778244023,LastTimestamp:2026-02-25 15:46:27.764852143 +0000 UTC m=+38.778244023,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.889020 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189787db67a0fc54\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787db67a0fc54 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:54.799770708 +0000 UTC m=+5.813162598,LastTimestamp:2026-02-25 15:46:27.783390104 +0000 UTC m=+38.796781994,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.893226 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189787db81d21ab3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787db81d21ab3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.239197363 +0000 UTC m=+6.252589293,LastTimestamp:2026-02-25 15:46:27.973186957 +0000 UTC m=+38.986578877,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:30 crc kubenswrapper[4937]: E0225 15:46:30.897456 4937 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189787db842e517b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189787db842e517b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:45:55.278795131 +0000 UTC m=+6.292187061,LastTimestamp:2026-02-25 15:46:27.984762422 +0000 UTC m=+38.998154312,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:46:31 crc kubenswrapper[4937]: I0225 15:46:31.229604 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:31 crc kubenswrapper[4937]: I0225 15:46:31.231757 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:31 crc kubenswrapper[4937]: I0225 15:46:31.231826 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:31 crc kubenswrapper[4937]: I0225 15:46:31.231846 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:31 crc kubenswrapper[4937]: I0225 15:46:31.231933 4937 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 15:46:31 crc kubenswrapper[4937]: E0225 15:46:31.237791 4937 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 25 15:46:31 crc kubenswrapper[4937]: E0225 15:46:31.243736 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 25 15:46:31 crc kubenswrapper[4937]: I0225 15:46:31.270050 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:31 crc kubenswrapper[4937]: I0225 
15:46:31.633017 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:46:31 crc kubenswrapper[4937]: I0225 15:46:31.633328 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:31 crc kubenswrapper[4937]: I0225 15:46:31.635461 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:31 crc kubenswrapper[4937]: I0225 15:46:31.635526 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:31 crc kubenswrapper[4937]: I0225 15:46:31.635588 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:32 crc kubenswrapper[4937]: I0225 15:46:32.270049 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:32 crc kubenswrapper[4937]: E0225 15:46:32.545630 4937 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 15:46:33 crc kubenswrapper[4937]: I0225 15:46:33.271153 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:33 crc kubenswrapper[4937]: W0225 15:46:33.287843 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:33 crc kubenswrapper[4937]: E0225 15:46:33.287920 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 25 15:46:34 crc kubenswrapper[4937]: I0225 15:46:34.273838 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:35 crc kubenswrapper[4937]: I0225 15:46:35.273506 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:35 crc kubenswrapper[4937]: I0225 15:46:35.367213 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:35 crc kubenswrapper[4937]: I0225 15:46:35.368981 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:35 crc kubenswrapper[4937]: I0225 15:46:35.369019 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:35 crc kubenswrapper[4937]: I0225 15:46:35.369028 4937 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:35 crc kubenswrapper[4937]: I0225 15:46:35.369610 4937 scope.go:117] "RemoveContainer" containerID="02d139956165b67982f337eb183f7ae1609bdd22fd03f70789b3745f263e29c4" Feb 25 15:46:35 crc kubenswrapper[4937]: I0225 15:46:35.651558 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 25 15:46:35 crc kubenswrapper[4937]: I0225 15:46:35.653289 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c789554662368bb9177b3f40e32538731d123eddfc72745b045a4985cb52464d"} Feb 25 15:46:35 crc kubenswrapper[4937]: I0225 15:46:35.653520 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:35 crc kubenswrapper[4937]: I0225 15:46:35.654392 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:35 crc kubenswrapper[4937]: I0225 15:46:35.654432 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:35 crc kubenswrapper[4937]: I0225 15:46:35.654442 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.197037 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.197220 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.198740 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.198779 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.198790 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.202649 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.272124 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.658177 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.659740 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.662994 4937 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="c789554662368bb9177b3f40e32538731d123eddfc72745b045a4985cb52464d" exitCode=255 Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.663099 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c789554662368bb9177b3f40e32538731d123eddfc72745b045a4985cb52464d"} Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.663156 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.663171 4937 scope.go:117] "RemoveContainer" containerID="02d139956165b67982f337eb183f7ae1609bdd22fd03f70789b3745f263e29c4" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.663357 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.664181 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.664220 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.664233 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.665230 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.665264 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.665279 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:36 crc kubenswrapper[4937]: I0225 15:46:36.665982 4937 scope.go:117] "RemoveContainer" containerID="c789554662368bb9177b3f40e32538731d123eddfc72745b045a4985cb52464d" Feb 25 15:46:36 crc kubenswrapper[4937]: E0225 15:46:36.666251 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:46:37 crc kubenswrapper[4937]: I0225 15:46:37.272637 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:37 crc kubenswrapper[4937]: I0225 15:46:37.668304 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 25 15:46:38 crc kubenswrapper[4937]: I0225 15:46:38.238956 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:38 crc kubenswrapper[4937]: I0225 15:46:38.240297 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:38 crc kubenswrapper[4937]: I0225 15:46:38.240360 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:38 crc kubenswrapper[4937]: I0225 15:46:38.240379 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:38 crc kubenswrapper[4937]: I0225 15:46:38.240429 4937 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 15:46:38 crc kubenswrapper[4937]: E0225 15:46:38.249419 4937 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 25 15:46:38 crc kubenswrapper[4937]: E0225 15:46:38.249427 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 25 15:46:38 crc kubenswrapper[4937]: I0225 15:46:38.267127 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:39 crc kubenswrapper[4937]: I0225 15:46:39.273989 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:40 crc kubenswrapper[4937]: I0225 15:46:40.271369 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:40 crc kubenswrapper[4937]: I0225 15:46:40.342106 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:46:40 crc kubenswrapper[4937]: I0225 15:46:40.344059 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:40 crc kubenswrapper[4937]: I0225 15:46:40.346099 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:40 crc kubenswrapper[4937]: I0225 15:46:40.346132 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:40 crc kubenswrapper[4937]: I0225 15:46:40.346144 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:40 crc kubenswrapper[4937]: I0225 15:46:40.346751 4937 scope.go:117] "RemoveContainer" containerID="c789554662368bb9177b3f40e32538731d123eddfc72745b045a4985cb52464d" Feb 25 15:46:40 crc kubenswrapper[4937]: E0225 15:46:40.346920 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:46:40 crc kubenswrapper[4937]: I0225 15:46:40.618448 4937 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:46:40 crc kubenswrapper[4937]: I0225 15:46:40.683044 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:40 crc kubenswrapper[4937]: I0225 15:46:40.684062 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:40 crc kubenswrapper[4937]: I0225 15:46:40.684208 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:40 crc kubenswrapper[4937]: I0225 15:46:40.684300 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:40 crc kubenswrapper[4937]: I0225 15:46:40.684955 4937 scope.go:117] "RemoveContainer" containerID="c789554662368bb9177b3f40e32538731d123eddfc72745b045a4985cb52464d" Feb 25 15:46:40 crc kubenswrapper[4937]: E0225 15:46:40.685223 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:46:41 crc kubenswrapper[4937]: I0225 15:46:41.270410 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:41 crc kubenswrapper[4937]: I0225 15:46:41.645033 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:46:41 crc kubenswrapper[4937]: I0225 15:46:41.645168 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:41 crc kubenswrapper[4937]: I0225 15:46:41.646084 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:41 crc kubenswrapper[4937]: I0225 15:46:41.646124 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:41 crc kubenswrapper[4937]: I0225 15:46:41.646135 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:42 crc kubenswrapper[4937]: I0225 15:46:42.273952 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:42 crc kubenswrapper[4937]: E0225 15:46:42.546661 4937 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 15:46:43 crc kubenswrapper[4937]: I0225 15:46:43.266169 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:44 crc kubenswrapper[4937]: I0225 15:46:44.270014 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:45 crc kubenswrapper[4937]: I0225 15:46:45.249564 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:45 crc kubenswrapper[4937]: I0225 15:46:45.250976 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:45 crc kubenswrapper[4937]: I0225 15:46:45.251016 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:45 crc kubenswrapper[4937]: I0225 15:46:45.251030 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:45 crc kubenswrapper[4937]: I0225 15:46:45.251055 4937 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 15:46:45 crc kubenswrapper[4937]: E0225 15:46:45.256662 4937 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 25 15:46:45 crc kubenswrapper[4937]: E0225 15:46:45.257216 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 25 15:46:45 crc kubenswrapper[4937]: I0225 15:46:45.271833 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:45 crc kubenswrapper[4937]: I0225 15:46:45.778512 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 15:46:45 crc kubenswrapper[4937]: I0225 15:46:45.778707 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:45 crc kubenswrapper[4937]: I0225 15:46:45.780058 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:45 crc kubenswrapper[4937]: I0225 15:46:45.780096 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:45 crc kubenswrapper[4937]: I0225 15:46:45.780111 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:46 crc kubenswrapper[4937]: I0225 15:46:46.270589 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:47 crc kubenswrapper[4937]: W0225 15:46:47.178754 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 25 15:46:47 crc kubenswrapper[4937]: E0225 15:46:47.178836 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: 
failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 25 15:46:47 crc kubenswrapper[4937]: I0225 15:46:47.270970 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:48 crc kubenswrapper[4937]: I0225 15:46:48.271879 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:49 crc kubenswrapper[4937]: I0225 15:46:49.272300 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:50 crc kubenswrapper[4937]: I0225 15:46:50.272439 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:51 crc kubenswrapper[4937]: I0225 15:46:51.272910 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:52 crc kubenswrapper[4937]: I0225 15:46:52.257895 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:52 crc kubenswrapper[4937]: I0225 15:46:52.259884 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:52 crc kubenswrapper[4937]: I0225 15:46:52.259971 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:52 crc kubenswrapper[4937]: I0225 15:46:52.259997 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:52 crc kubenswrapper[4937]: I0225 15:46:52.260056 4937 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 15:46:52 crc kubenswrapper[4937]: E0225 15:46:52.264717 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 25 15:46:52 crc kubenswrapper[4937]: E0225 15:46:52.265791 4937 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 25 15:46:52 crc kubenswrapper[4937]: I0225 15:46:52.271796 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:52 crc kubenswrapper[4937]: E0225 15:46:52.546773 4937 eviction_manager.go:285] 
"Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 15:46:53 crc kubenswrapper[4937]: I0225 15:46:53.271438 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:54 crc kubenswrapper[4937]: I0225 15:46:54.270434 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:55 crc kubenswrapper[4937]: W0225 15:46:55.007950 4937 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 25 15:46:55 crc kubenswrapper[4937]: E0225 15:46:55.008044 4937 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 25 15:46:55 crc kubenswrapper[4937]: I0225 15:46:55.270917 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:55 crc kubenswrapper[4937]: I0225 15:46:55.366916 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:55 crc kubenswrapper[4937]: I0225 15:46:55.369222 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:55 crc kubenswrapper[4937]: I0225 15:46:55.369302 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:55 crc kubenswrapper[4937]: I0225 15:46:55.369321 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:55 crc kubenswrapper[4937]: I0225 15:46:55.370410 4937 scope.go:117] "RemoveContainer" containerID="c789554662368bb9177b3f40e32538731d123eddfc72745b045a4985cb52464d" Feb 25 15:46:55 crc kubenswrapper[4937]: E0225 15:46:55.370667 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:46:56 crc kubenswrapper[4937]: I0225 15:46:56.274581 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:57 crc kubenswrapper[4937]: I0225 15:46:57.272841 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the 
cluster scope Feb 25 15:46:58 crc kubenswrapper[4937]: I0225 15:46:58.276525 4937 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 15:46:58 crc kubenswrapper[4937]: I0225 15:46:58.932378 4937 csr.go:261] certificate signing request csr-ggh4l is approved, waiting to be issued Feb 25 15:46:58 crc kubenswrapper[4937]: I0225 15:46:58.952099 4937 csr.go:257] certificate signing request csr-ggh4l is issued Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.049381 4937 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.157845 4937 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.265921 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.268119 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.268179 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.268196 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.268377 4937 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.278928 4937 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.279383 4937 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 25 15:46:59 crc kubenswrapper[4937]: E0225 15:46:59.279407 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.284053 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.284106 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.284124 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.284149 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.284165 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:46:59Z","lastTransitionTime":"2026-02-25T15:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:46:59 crc kubenswrapper[4937]: E0225 15:46:59.295789 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.303805 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.303881 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.303933 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.303965 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.303984 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:46:59Z","lastTransitionTime":"2026-02-25T15:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:46:59 crc kubenswrapper[4937]: E0225 15:46:59.319083 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.328345 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.328394 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.328411 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.328433 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.328448 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:46:59Z","lastTransitionTime":"2026-02-25T15:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:46:59 crc kubenswrapper[4937]: E0225 15:46:59.342908 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.352613 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.352666 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.352686 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.352769 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.352792 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:46:59Z","lastTransitionTime":"2026-02-25T15:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:46:59 crc kubenswrapper[4937]: E0225 15:46:59.368025 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:46:59 crc kubenswrapper[4937]: E0225 15:46:59.368156 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:46:59 crc kubenswrapper[4937]: E0225 15:46:59.368192 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:46:59 crc kubenswrapper[4937]: E0225 15:46:59.468880 4937 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:46:59 crc kubenswrapper[4937]: E0225 15:46:59.569198 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:46:59 crc kubenswrapper[4937]: E0225 15:46:59.669575 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:46:59 crc kubenswrapper[4937]: E0225 15:46:59.770407 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:46:59 crc kubenswrapper[4937]: E0225 15:46:59.870940 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.953770 4937 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-30 01:59:56.45874004 +0000 UTC Feb 25 15:46:59 crc kubenswrapper[4937]: I0225 15:46:59.953827 4937 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6658h12m56.504918855s for next certificate rotation Feb 25 15:46:59 crc kubenswrapper[4937]: E0225 15:46:59.971768 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:00 crc kubenswrapper[4937]: E0225 15:47:00.072176 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:00 crc kubenswrapper[4937]: E0225 15:47:00.172702 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:00 crc kubenswrapper[4937]: E0225 15:47:00.273104 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:00 crc kubenswrapper[4937]: E0225 15:47:00.373602 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:00 crc kubenswrapper[4937]: E0225 15:47:00.474180 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:00 crc kubenswrapper[4937]: E0225 15:47:00.574666 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:00 crc kubenswrapper[4937]: E0225 15:47:00.675459 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:00 crc kubenswrapper[4937]: E0225 15:47:00.775786 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:00 crc kubenswrapper[4937]: E0225 15:47:00.876213 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:00 crc kubenswrapper[4937]: E0225 15:47:00.976354 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:01 crc kubenswrapper[4937]: E0225 15:47:01.077404 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:01 crc kubenswrapper[4937]: E0225 15:47:01.178577 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:01 crc kubenswrapper[4937]: E0225 15:47:01.279542 4937 kubelet_node_status.go:503] "Error 
getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:01 crc kubenswrapper[4937]: E0225 15:47:01.379983 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:01 crc kubenswrapper[4937]: E0225 15:47:01.481103 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:01 crc kubenswrapper[4937]: E0225 15:47:01.581283 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:01 crc kubenswrapper[4937]: E0225 15:47:01.682173 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:01 crc kubenswrapper[4937]: E0225 15:47:01.782851 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:01 crc kubenswrapper[4937]: E0225 15:47:01.886408 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:01 crc kubenswrapper[4937]: E0225 15:47:01.987077 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:02 crc kubenswrapper[4937]: E0225 15:47:02.088005 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:02 crc kubenswrapper[4937]: E0225 15:47:02.189072 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:02 crc kubenswrapper[4937]: E0225 15:47:02.290059 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:02 crc kubenswrapper[4937]: E0225 15:47:02.391076 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:02 crc kubenswrapper[4937]: E0225 15:47:02.492244 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:02 crc kubenswrapper[4937]: E0225 15:47:02.546885 4937 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 15:47:02 crc kubenswrapper[4937]: E0225 15:47:02.593242 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:02 crc kubenswrapper[4937]: E0225 15:47:02.693857 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:02 crc kubenswrapper[4937]: E0225 15:47:02.794322 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:02 crc kubenswrapper[4937]: E0225 15:47:02.895309 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:02 crc kubenswrapper[4937]: E0225 15:47:02.996212 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:03 crc kubenswrapper[4937]: E0225 15:47:03.096609 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:03 crc kubenswrapper[4937]: E0225 15:47:03.196699 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:03 crc kubenswrapper[4937]: E0225 15:47:03.297204 
4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:03 crc kubenswrapper[4937]: E0225 15:47:03.397301 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:03 crc kubenswrapper[4937]: E0225 15:47:03.497668 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:03 crc kubenswrapper[4937]: E0225 15:47:03.597891 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:03 crc kubenswrapper[4937]: E0225 15:47:03.698639 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:03 crc kubenswrapper[4937]: E0225 15:47:03.798910 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:03 crc kubenswrapper[4937]: E0225 15:47:03.899906 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:04 crc kubenswrapper[4937]: E0225 15:47:04.000784 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:04 crc kubenswrapper[4937]: E0225 15:47:04.101565 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:04 crc kubenswrapper[4937]: E0225 15:47:04.202026 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:04 crc kubenswrapper[4937]: E0225 15:47:04.302177 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:04 crc kubenswrapper[4937]: E0225 15:47:04.402398 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:04 crc kubenswrapper[4937]: E0225 15:47:04.503032 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:04 crc kubenswrapper[4937]: E0225 15:47:04.603146 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:04 crc kubenswrapper[4937]: E0225 15:47:04.704064 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:04 crc kubenswrapper[4937]: E0225 15:47:04.804651 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:04 crc kubenswrapper[4937]: E0225 15:47:04.905071 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:05 crc kubenswrapper[4937]: E0225 15:47:05.006058 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:05 crc kubenswrapper[4937]: E0225 15:47:05.107014 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:05 crc kubenswrapper[4937]: E0225 15:47:05.207996 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:05 crc kubenswrapper[4937]: E0225 15:47:05.308518 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:05 crc kubenswrapper[4937]: E0225 
15:47:05.409595 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:05 crc kubenswrapper[4937]: E0225 15:47:05.510500 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:05 crc kubenswrapper[4937]: E0225 15:47:05.610910 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:05 crc kubenswrapper[4937]: E0225 15:47:05.711579 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:05 crc kubenswrapper[4937]: E0225 15:47:05.811963 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:05 crc kubenswrapper[4937]: E0225 15:47:05.912052 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:06 crc kubenswrapper[4937]: E0225 15:47:06.012528 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:06 crc kubenswrapper[4937]: E0225 15:47:06.113035 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:06 crc kubenswrapper[4937]: E0225 15:47:06.213555 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:06 crc kubenswrapper[4937]: E0225 15:47:06.314643 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:06 crc kubenswrapper[4937]: I0225 15:47:06.366693 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:47:06 crc kubenswrapper[4937]: I0225 15:47:06.368742 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:06 crc kubenswrapper[4937]: I0225 15:47:06.368785 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:06 crc kubenswrapper[4937]: I0225 15:47:06.368797 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:06 crc kubenswrapper[4937]: I0225 15:47:06.369333 4937 scope.go:117] "RemoveContainer" containerID="c789554662368bb9177b3f40e32538731d123eddfc72745b045a4985cb52464d" Feb 25 15:47:06 crc kubenswrapper[4937]: E0225 15:47:06.415297 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:06 crc kubenswrapper[4937]: E0225 15:47:06.516420 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:06 crc kubenswrapper[4937]: E0225 15:47:06.617505 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:06 crc kubenswrapper[4937]: E0225 15:47:06.717811 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:06 crc kubenswrapper[4937]: E0225 15:47:06.817913 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:06 crc kubenswrapper[4937]: E0225 15:47:06.918206 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:07 crc 
kubenswrapper[4937]: E0225 15:47:07.018958 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:07 crc kubenswrapper[4937]: E0225 15:47:07.119525 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:07 crc kubenswrapper[4937]: E0225 15:47:07.220618 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:07 crc kubenswrapper[4937]: E0225 15:47:07.321360 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:07 crc kubenswrapper[4937]: E0225 15:47:07.421852 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:07 crc kubenswrapper[4937]: E0225 15:47:07.523633 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:07 crc kubenswrapper[4937]: E0225 15:47:07.624586 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:07 crc kubenswrapper[4937]: E0225 15:47:07.725181 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:07 crc kubenswrapper[4937]: I0225 15:47:07.751402 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 15:47:07 crc kubenswrapper[4937]: I0225 15:47:07.752005 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 25 15:47:07 crc kubenswrapper[4937]: I0225 15:47:07.754360 4937 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527" exitCode=255 Feb 25 15:47:07 crc kubenswrapper[4937]: I0225 15:47:07.754403 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527"} Feb 25 15:47:07 crc kubenswrapper[4937]: I0225 15:47:07.754462 4937 scope.go:117] "RemoveContainer" containerID="c789554662368bb9177b3f40e32538731d123eddfc72745b045a4985cb52464d" Feb 25 15:47:07 crc kubenswrapper[4937]: I0225 15:47:07.754603 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:47:07 crc kubenswrapper[4937]: I0225 15:47:07.755729 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:07 crc kubenswrapper[4937]: I0225 15:47:07.755761 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:07 crc kubenswrapper[4937]: I0225 15:47:07.755777 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:07 crc kubenswrapper[4937]: I0225 15:47:07.756548 4937 scope.go:117] "RemoveContainer" containerID="74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527" Feb 25 15:47:07 crc kubenswrapper[4937]: E0225 15:47:07.756750 4937 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:47:07 crc kubenswrapper[4937]: E0225 15:47:07.826159 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:07 crc kubenswrapper[4937]: E0225 15:47:07.926736 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:08 crc kubenswrapper[4937]: E0225 15:47:08.027009 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:08 crc kubenswrapper[4937]: E0225 15:47:08.127837 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:08 crc kubenswrapper[4937]: E0225 15:47:08.228297 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:08 crc kubenswrapper[4937]: E0225 15:47:08.328970 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:08 crc kubenswrapper[4937]: E0225 15:47:08.429555 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:08 crc kubenswrapper[4937]: E0225 15:47:08.530141 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:08 crc kubenswrapper[4937]: E0225 15:47:08.630540 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:08 crc kubenswrapper[4937]: E0225 15:47:08.730871 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:08 crc kubenswrapper[4937]: I0225 15:47:08.757842 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 15:47:08 crc kubenswrapper[4937]: E0225 15:47:08.831500 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:08 crc kubenswrapper[4937]: E0225 15:47:08.931803 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.032619 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.133413 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.234519 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.334956 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.435071 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.441230 4937 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.452604 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.452658 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.452675 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.452699 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.452716 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:09Z","lastTransitionTime":"2026-02-25T15:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.468552 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.479599 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.479639 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.479650 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.479667 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.479678 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:09Z","lastTransitionTime":"2026-02-25T15:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.492734 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.500391 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.500446 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.500464 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.500519 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.500541 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:09Z","lastTransitionTime":"2026-02-25T15:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.512701 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.527285 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.527323 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.527334 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.527350 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:09 crc kubenswrapper[4937]: I0225 15:47:09.527362 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:09Z","lastTransitionTime":"2026-02-25T15:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.544256 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.544613 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.544655 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.644784 4937 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.746036 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.846373 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:09 crc kubenswrapper[4937]: E0225 15:47:09.946944 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:10 crc kubenswrapper[4937]: E0225 15:47:10.047752 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:10 crc kubenswrapper[4937]: E0225 15:47:10.148578 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:10 crc kubenswrapper[4937]: E0225 15:47:10.249091 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:10 crc kubenswrapper[4937]: I0225 15:47:10.342297 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:47:10 crc kubenswrapper[4937]: I0225 15:47:10.342537 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:47:10 crc kubenswrapper[4937]: I0225 15:47:10.344078 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:10 crc kubenswrapper[4937]: I0225 15:47:10.344140 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:10 crc kubenswrapper[4937]: I0225 15:47:10.344163 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:10 crc kubenswrapper[4937]: I0225 15:47:10.345317 4937 scope.go:117] "RemoveContainer" containerID="74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527" Feb 25 15:47:10 crc kubenswrapper[4937]: E0225 15:47:10.345695 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:47:10 crc kubenswrapper[4937]: E0225 15:47:10.350169 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:10 crc kubenswrapper[4937]: E0225 15:47:10.450308 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:10 crc kubenswrapper[4937]: E0225 15:47:10.550418 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:10 crc kubenswrapper[4937]: I0225 15:47:10.618890 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:47:10 crc kubenswrapper[4937]: E0225 15:47:10.651220 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:10 crc kubenswrapper[4937]: E0225 
15:47:10.751577 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:10 crc kubenswrapper[4937]: I0225 15:47:10.765008 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:47:10 crc kubenswrapper[4937]: I0225 15:47:10.765952 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:10 crc kubenswrapper[4937]: I0225 15:47:10.766007 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:10 crc kubenswrapper[4937]: I0225 15:47:10.766016 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:10 crc kubenswrapper[4937]: I0225 15:47:10.766811 4937 scope.go:117] "RemoveContainer" containerID="74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527" Feb 25 15:47:10 crc kubenswrapper[4937]: E0225 15:47:10.767009 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:47:10 crc kubenswrapper[4937]: E0225 15:47:10.852625 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:10 crc kubenswrapper[4937]: E0225 15:47:10.953270 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:11 crc kubenswrapper[4937]: E0225 15:47:11.053733 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:11 crc kubenswrapper[4937]: E0225 15:47:11.154768 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:11 crc kubenswrapper[4937]: E0225 15:47:11.255934 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:11 crc kubenswrapper[4937]: E0225 15:47:11.357031 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:11 crc kubenswrapper[4937]: E0225 15:47:11.457345 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:11 crc kubenswrapper[4937]: E0225 15:47:11.558824 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:11 crc kubenswrapper[4937]: E0225 15:47:11.659803 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:11 crc kubenswrapper[4937]: E0225 15:47:11.760686 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:11 crc kubenswrapper[4937]: E0225 15:47:11.861195 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:11 crc kubenswrapper[4937]: E0225 15:47:11.962330 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:12 crc kubenswrapper[4937]: E0225 15:47:12.062728 
4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:12 crc kubenswrapper[4937]: E0225 15:47:12.163857 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:12 crc kubenswrapper[4937]: E0225 15:47:12.264067 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:12 crc kubenswrapper[4937]: E0225 15:47:12.364916 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:12 crc kubenswrapper[4937]: E0225 15:47:12.466136 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:12 crc kubenswrapper[4937]: E0225 15:47:12.546992 4937 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 15:47:12 crc kubenswrapper[4937]: E0225 15:47:12.566814 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:12 crc kubenswrapper[4937]: E0225 15:47:12.667384 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:12 crc kubenswrapper[4937]: E0225 15:47:12.768053 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:12 crc kubenswrapper[4937]: E0225 15:47:12.868356 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:12 crc kubenswrapper[4937]: E0225 15:47:12.969298 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:13 crc kubenswrapper[4937]: E0225 15:47:13.069656 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:13 crc kubenswrapper[4937]: E0225 15:47:13.170647 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:13 crc kubenswrapper[4937]: E0225 15:47:13.271650 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:13 crc kubenswrapper[4937]: E0225 15:47:13.371866 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:13 crc kubenswrapper[4937]: E0225 15:47:13.472748 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:13 crc kubenswrapper[4937]: I0225 15:47:13.557353 4937 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 25 15:47:13 crc kubenswrapper[4937]: E0225 15:47:13.573930 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:13 crc kubenswrapper[4937]: E0225 15:47:13.674736 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:13 crc kubenswrapper[4937]: E0225 15:47:13.775079 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:13 crc kubenswrapper[4937]: E0225 15:47:13.875349 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:13 crc 
kubenswrapper[4937]: E0225 15:47:13.975530 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:14 crc kubenswrapper[4937]: E0225 15:47:14.075955 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:14 crc kubenswrapper[4937]: E0225 15:47:14.176096 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:14 crc kubenswrapper[4937]: E0225 15:47:14.276653 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:14 crc kubenswrapper[4937]: E0225 15:47:14.376841 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:14 crc kubenswrapper[4937]: E0225 15:47:14.477204 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:14 crc kubenswrapper[4937]: E0225 15:47:14.578302 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:14 crc kubenswrapper[4937]: E0225 15:47:14.678875 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:14 crc kubenswrapper[4937]: E0225 15:47:14.779561 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:14 crc kubenswrapper[4937]: E0225 15:47:14.879758 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:14 crc kubenswrapper[4937]: E0225 15:47:14.980641 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:15 crc kubenswrapper[4937]: E0225 15:47:15.081750 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:15 crc kubenswrapper[4937]: E0225 15:47:15.182235 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:15 crc kubenswrapper[4937]: E0225 15:47:15.283252 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:15 crc kubenswrapper[4937]: I0225 15:47:15.376804 4937 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 25 15:47:15 crc kubenswrapper[4937]: E0225 15:47:15.384113 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:15 crc kubenswrapper[4937]: I0225 15:47:15.442294 4937 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 25 15:47:15 crc kubenswrapper[4937]: E0225 15:47:15.484978 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:15 crc kubenswrapper[4937]: E0225 15:47:15.585420 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:15 crc kubenswrapper[4937]: E0225 15:47:15.686048 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:15 crc kubenswrapper[4937]: E0225 15:47:15.786881 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 
15:47:15 crc kubenswrapper[4937]: E0225 15:47:15.887077 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:15 crc kubenswrapper[4937]: E0225 15:47:15.987322 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:16 crc kubenswrapper[4937]: E0225 15:47:16.088026 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:16 crc kubenswrapper[4937]: E0225 15:47:16.189145 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:16 crc kubenswrapper[4937]: E0225 15:47:16.290191 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:16 crc kubenswrapper[4937]: E0225 15:47:16.390893 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:16 crc kubenswrapper[4937]: E0225 15:47:16.491947 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:16 crc kubenswrapper[4937]: E0225 15:47:16.592318 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:16 crc kubenswrapper[4937]: E0225 15:47:16.692962 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:16 crc kubenswrapper[4937]: E0225 15:47:16.793679 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:16 crc kubenswrapper[4937]: E0225 15:47:16.894768 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:16 crc kubenswrapper[4937]: E0225 15:47:16.995669 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:17 crc kubenswrapper[4937]: E0225 15:47:17.096307 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:17 crc kubenswrapper[4937]: E0225 15:47:17.196930 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:17 crc kubenswrapper[4937]: E0225 15:47:17.297654 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:17 crc kubenswrapper[4937]: E0225 15:47:17.398465 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:17 crc kubenswrapper[4937]: E0225 15:47:17.498910 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:17 crc kubenswrapper[4937]: E0225 15:47:17.599731 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:17 crc kubenswrapper[4937]: E0225 15:47:17.700387 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:17 crc kubenswrapper[4937]: E0225 15:47:17.801376 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:17 crc kubenswrapper[4937]: E0225 15:47:17.901642 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Feb 25 15:47:18 crc kubenswrapper[4937]: E0225 15:47:18.002208 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:18 crc kubenswrapper[4937]: E0225 15:47:18.102675 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:18 crc kubenswrapper[4937]: E0225 15:47:18.202947 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:18 crc kubenswrapper[4937]: E0225 15:47:18.303984 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:18 crc kubenswrapper[4937]: E0225 15:47:18.404909 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:18 crc kubenswrapper[4937]: E0225 15:47:18.505656 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:18 crc kubenswrapper[4937]: E0225 15:47:18.606520 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:18 crc kubenswrapper[4937]: E0225 15:47:18.707132 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:18 crc kubenswrapper[4937]: E0225 15:47:18.808165 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:18 crc kubenswrapper[4937]: E0225 15:47:18.909315 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.010568 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.110728 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.211666 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.312239 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.413217 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.513646 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.614146 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.715199 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.728525 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.733963 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.734024 4937 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.734035 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.734055 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.734066 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:19Z","lastTransitionTime":"2026-02-25T15:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.747173 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.751936 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.751993 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.752010 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.752029 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.752042 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:19Z","lastTransitionTime":"2026-02-25T15:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.765540 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.770318 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.770369 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.770385 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.770407 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.770421 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:19Z","lastTransitionTime":"2026-02-25T15:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.781782 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.785714 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.785746 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.785753 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.785769 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:19 crc kubenswrapper[4937]: I0225 15:47:19.785779 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:19Z","lastTransitionTime":"2026-02-25T15:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.797608 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.797817 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.815624 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:19 crc kubenswrapper[4937]: E0225 15:47:19.916767 4937 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:20 crc kubenswrapper[4937]: E0225 15:47:20.017954 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:20 crc kubenswrapper[4937]: E0225 15:47:20.119150 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:20 crc kubenswrapper[4937]: E0225 15:47:20.220037 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:20 crc kubenswrapper[4937]: E0225 15:47:20.320569 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:20 crc kubenswrapper[4937]: E0225 15:47:20.421311 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:20 crc kubenswrapper[4937]: E0225 15:47:20.521650 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:20 crc kubenswrapper[4937]: E0225 15:47:20.621927 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:20 crc kubenswrapper[4937]: E0225 15:47:20.722083 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:20 crc kubenswrapper[4937]: E0225 15:47:20.822926 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:20 crc kubenswrapper[4937]: E0225 15:47:20.923875 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:21 crc kubenswrapper[4937]: E0225 15:47:21.024477 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:21 crc kubenswrapper[4937]: E0225 15:47:21.126012 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:21 crc kubenswrapper[4937]: E0225 15:47:21.227014 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:21 crc kubenswrapper[4937]: E0225 15:47:21.327891 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:21 crc kubenswrapper[4937]: I0225 15:47:21.366896 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:47:21 crc kubenswrapper[4937]: I0225 15:47:21.368405 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:21 crc kubenswrapper[4937]: I0225 15:47:21.368464 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:21 crc kubenswrapper[4937]: I0225 15:47:21.368525 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:21 crc kubenswrapper[4937]: E0225 15:47:21.429041 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:21 crc kubenswrapper[4937]: E0225 15:47:21.530196 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:21 crc kubenswrapper[4937]: E0225 
15:47:21.630599 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:21 crc kubenswrapper[4937]: E0225 15:47:21.730963 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:21 crc kubenswrapper[4937]: E0225 15:47:21.831827 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:21 crc kubenswrapper[4937]: E0225 15:47:21.932579 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:22 crc kubenswrapper[4937]: E0225 15:47:22.033310 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:22 crc kubenswrapper[4937]: E0225 15:47:22.133837 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:22 crc kubenswrapper[4937]: E0225 15:47:22.234189 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:22 crc kubenswrapper[4937]: E0225 15:47:22.334367 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:22 crc kubenswrapper[4937]: E0225 15:47:22.435271 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:22 crc kubenswrapper[4937]: E0225 15:47:22.535891 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:22 crc kubenswrapper[4937]: E0225 15:47:22.547132 4937 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 15:47:22 crc kubenswrapper[4937]: E0225 15:47:22.636564 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:22 crc kubenswrapper[4937]: E0225 15:47:22.737755 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:22 crc kubenswrapper[4937]: E0225 15:47:22.838610 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:22 crc kubenswrapper[4937]: E0225 15:47:22.939400 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:23 crc kubenswrapper[4937]: E0225 15:47:23.039944 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:23 crc kubenswrapper[4937]: E0225 15:47:23.140359 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:23 crc kubenswrapper[4937]: E0225 15:47:23.240854 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:23 crc kubenswrapper[4937]: E0225 15:47:23.341379 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:23 crc kubenswrapper[4937]: I0225 15:47:23.366989 4937 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 15:47:23 crc kubenswrapper[4937]: I0225 15:47:23.368476 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 
15:47:23 crc kubenswrapper[4937]: I0225 15:47:23.368535 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:23 crc kubenswrapper[4937]: I0225 15:47:23.368547 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:23 crc kubenswrapper[4937]: I0225 15:47:23.369140 4937 scope.go:117] "RemoveContainer" containerID="74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527" Feb 25 15:47:23 crc kubenswrapper[4937]: E0225 15:47:23.369321 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:47:23 crc kubenswrapper[4937]: E0225 15:47:23.441805 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:23 crc kubenswrapper[4937]: E0225 15:47:23.542265 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:23 crc kubenswrapper[4937]: E0225 15:47:23.642470 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:23 crc kubenswrapper[4937]: E0225 15:47:23.743439 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:23 crc kubenswrapper[4937]: E0225 15:47:23.844581 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:23 crc kubenswrapper[4937]: E0225 15:47:23.944951 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:24 crc kubenswrapper[4937]: E0225 15:47:24.045529 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:24 crc kubenswrapper[4937]: E0225 15:47:24.145640 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:24 crc kubenswrapper[4937]: E0225 15:47:24.246499 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:24 crc kubenswrapper[4937]: E0225 15:47:24.346999 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:24 crc kubenswrapper[4937]: E0225 15:47:24.447198 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:24 crc kubenswrapper[4937]: E0225 15:47:24.548007 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:24 crc kubenswrapper[4937]: E0225 15:47:24.648170 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:24 crc kubenswrapper[4937]: E0225 15:47:24.749582 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:24 crc kubenswrapper[4937]: E0225 15:47:24.850357 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 15:47:24 crc 
kubenswrapper[4937]: E0225 15:47:24.951071 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:25 crc kubenswrapper[4937]: E0225 15:47:25.051918 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:25 crc kubenswrapper[4937]: E0225 15:47:25.152204 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:25 crc kubenswrapper[4937]: E0225 15:47:25.253460 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:25 crc kubenswrapper[4937]: E0225 15:47:25.353616 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:25 crc kubenswrapper[4937]: E0225 15:47:25.453727 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:25 crc kubenswrapper[4937]: E0225 15:47:25.553871 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:25 crc kubenswrapper[4937]: E0225 15:47:25.654094 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:25 crc kubenswrapper[4937]: E0225 15:47:25.754942 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:25 crc kubenswrapper[4937]: E0225 15:47:25.856219 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:25 crc kubenswrapper[4937]: E0225 15:47:25.957164 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:26 crc kubenswrapper[4937]: E0225 15:47:26.058342 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:26 crc kubenswrapper[4937]: E0225 15:47:26.159541 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:26 crc kubenswrapper[4937]: E0225 15:47:26.259955 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:26 crc kubenswrapper[4937]: E0225 15:47:26.361425 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:26 crc kubenswrapper[4937]: E0225 15:47:26.462080 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:26 crc kubenswrapper[4937]: E0225 15:47:26.562556 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:26 crc kubenswrapper[4937]: E0225 15:47:26.663529 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:26 crc kubenswrapper[4937]: E0225 15:47:26.764411 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:26 crc kubenswrapper[4937]: E0225 15:47:26.865421 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:26 crc kubenswrapper[4937]: E0225 15:47:26.965988 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:27 crc kubenswrapper[4937]: E0225 15:47:27.066928 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:27 crc kubenswrapper[4937]: E0225 15:47:27.167012 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:27 crc kubenswrapper[4937]: E0225 15:47:27.267982 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:27 crc kubenswrapper[4937]: E0225 15:47:27.369622 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:27 crc kubenswrapper[4937]: E0225 15:47:27.469816 4937 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.545680 4937 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.573580 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.574434 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.574633 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.574797 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.574936 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:27Z","lastTransitionTime":"2026-02-25T15:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.678253 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.678323 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.678352 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.678384 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.678410 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:27Z","lastTransitionTime":"2026-02-25T15:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.781394 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.781451 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.781468 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.781518 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.781536 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:27Z","lastTransitionTime":"2026-02-25T15:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.884610 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.884664 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.884675 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.884693 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.884705 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:27Z","lastTransitionTime":"2026-02-25T15:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.987116 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.987182 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.987204 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.987241 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 15:47:27 crc kubenswrapper[4937]: I0225 15:47:27.987260 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:27Z","lastTransitionTime":"2026-02-25T15:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.024304 4937 apiserver.go:52] "Watching apiserver" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.035272 4937 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.035684 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.036061 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.036115 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.036127 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.036658 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.036723 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.037610 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.038000 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.038052 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.037639 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.038804 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.039073 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.039909 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.039906 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.040437 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.039956 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.040879 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.040923 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.041368 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.065671 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.069328 4937 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.086003 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.090739 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.090959 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.091141 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.091314 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.091556 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:28Z","lastTransitionTime":"2026-02-25T15:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.097521 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.108056 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.117218 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.129157 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.131535 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.131673 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.131772 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.131929 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.132038 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.132146 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.132040 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.132324 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.132282 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.132544 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.132666 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.132836 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.132884 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.132925 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133180 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133129 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133068 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133281 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133324 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133362 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133394 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133424 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133459 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133523 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133555 4937 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133583 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133612 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133643 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133672 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133706 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133736 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133770 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133803 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133833 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 
15:47:28.133872 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133904 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133933 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133963 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133319 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133993 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133326 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133748 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133804 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134027 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.133965 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134058 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134089 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134121 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134152 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134184 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134217 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134248 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134279 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" 
(UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134311 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134341 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134374 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134404 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134439 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134474 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134526 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134558 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134589 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134619 4937 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134651 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134683 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134716 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134750 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134779 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134808 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134838 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134867 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134897 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134926 4937 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134957 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134989 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135132 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135167 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135200 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135231 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135264 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135298 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135331 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135367 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135397 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135429 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135460 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135517 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135549 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135580 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135610 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135642 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135674 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135708 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135743 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135797 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135838 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135869 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135899 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135931 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135963 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135994 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136025 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136055 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136087 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136118 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136152 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136187 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136220 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136253 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136285 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136319 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136355 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136390 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136420 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136452 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136508 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136541 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136574 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136606 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136640 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136676 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136710 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136743 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136777 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136811 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136846 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136879 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136912 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136943 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136977 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.137022 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.137073 4937 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.137122 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.138682 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.138738 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.138776 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.138809 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.138843 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.138877 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.138909 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.138940 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.139001 4937 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144139 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144226 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144274 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144315 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144360 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144405 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144443 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144536 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144586 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 
15:47:28.144631 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144672 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144718 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144763 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144802 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144845 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144888 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144928 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144974 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.145063 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.145791 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.145837 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.145878 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134184 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134262 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134465 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134583 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134928 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.134947 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135219 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135269 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.135702 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136135 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136335 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136607 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136806 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.136892 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.138620 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.138683 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.138936 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.139042 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.139094 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.139389 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.139507 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.139828 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.139881 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.146898 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.146979 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.140008 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.140195 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.140402 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.140568 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.140595 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.140605 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.140744 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.140827 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.140819 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.140845 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141050 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141108 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141185 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141024 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141305 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141414 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141403 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141735 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141789 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141887 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141884 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141918 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141930 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141951 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141967 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.142155 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.142250 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.142370 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.142407 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.142439 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.142458 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.142663 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.142967 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.142972 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.143152 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.143153 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.143196 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.143303 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.141293 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.143314 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.143320 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.143327 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.143448 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.143705 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.143376 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144011 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144105 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144638 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144698 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144868 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144923 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.144959 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.145424 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148114 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.145713 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.145727 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.145773 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.145868 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.146038 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.146168 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.146191 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.146342 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:47:28.646283627 +0000 UTC m=+99.659675517 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148275 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148305 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148313 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148402 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.147316 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.147761 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148711 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.147833 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148619 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148639 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148834 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148412 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148658 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148770 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148885 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148924 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148880 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.147059 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148889 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.148919 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.149268 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.149509 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.149718 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.149791 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.149804 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.149836 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.150079 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.150091 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.150216 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.150293 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151051 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151083 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151102 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151198 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151246 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151281 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151306 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151327 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151397 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151670 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151441 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151439 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151698 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151660 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151521 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151841 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151875 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151922 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151944 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151966 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.151992 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152013 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152035 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152037 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152075 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152120 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152147 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152168 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152083 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152186 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152406 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152451 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152472 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152745 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152799 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152897 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.152980 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.153016 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.153041 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157623 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157653 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157676 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157699 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157721 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157746 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157772 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157793 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157818 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157840 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157862 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157885 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157910 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157932 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157955 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.157977 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158004 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158028 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158049 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.153376 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158092 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158120 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158146 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158167 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158189 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158214 4937 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158243 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158266 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158293 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158317 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158342 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158368 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158393 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158458 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158551 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158567 4937 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158581 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158596 4937 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158774 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158791 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158804 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158816 4937 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158829 4937 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158844 4937 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158856 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158870 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: 
I0225 15:47:28.158883 4937 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158896 4937 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159027 4937 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159044 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159056 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159071 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159083 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159096 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159108 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159121 4937 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159134 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159149 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159163 4937 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159175 4937 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159187 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159200 4937 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159212 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159225 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159240 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159252 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159263 4937 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159277 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159289 4937 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159301 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159314 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159327 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 
15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159339 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159352 4937 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159365 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159377 4937 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159389 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159401 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159415 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159428 4937 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159440 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159507 4937 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159521 4937 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159535 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159549 4937 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc 
kubenswrapper[4937]: I0225 15:47:28.159564 4937 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159577 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159589 4937 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159602 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159614 4937 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159626 4937 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159639 4937 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159651 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159663 4937 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159675 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159687 4937 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159699 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159712 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath 
\"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159723 4937 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159735 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159747 4937 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159760 4937 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159772 4937 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159787 4937 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159799 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159811 4937 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159823 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160553 4937 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160570 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160583 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160596 4937 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc 
kubenswrapper[4937]: I0225 15:47:28.160608 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160620 4937 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160632 4937 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160644 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160660 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160672 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160685 4937 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160697 4937 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160708 4937 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160721 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160733 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160745 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160757 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 
crc kubenswrapper[4937]: I0225 15:47:28.160769 4937 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160781 4937 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160793 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160805 4937 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160817 4937 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160829 4937 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160841 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160853 4937 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160865 4937 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160876 4937 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160888 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160901 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160914 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: 
I0225 15:47:28.160926 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160939 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160955 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160967 4937 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160979 4937 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160991 4937 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161002 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161015 4937 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161027 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161040 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161058 4937 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161070 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161083 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 25 
15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161096 4937 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161109 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161122 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161134 4937 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161146 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161160 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161182 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161206 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161223 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161236 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161251 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161264 4937 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161275 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: 
I0225 15:47:28.161286 4937 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161299 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161313 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161324 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161337 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161287 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161349 4937 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161399 4937 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161427 4937 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161447 4937 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161468 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161521 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161549 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161574 4937 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.161594 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.153255 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.153439 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.153747 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.153948 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.154081 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.154226 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.154404 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.154532 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158074 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158328 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158635 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.158869 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.159230 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.160009 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.162502 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.162666 4937 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.162973 4937 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.163005 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.163039 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.163207 4937 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.163464 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.163694 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.163696 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.163721 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:28.663697795 +0000 UTC m=+99.677089695 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.163756 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:28.663744096 +0000 UTC m=+99.677135996 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.163849 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.163932 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.163944 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.164449 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.164517 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.164615 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.164788 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.165040 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.165305 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.165301 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.165437 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.165625 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.165650 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.165709 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.165820 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.165904 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.165917 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.166158 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.166544 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.168062 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.172750 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.174111 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.174542 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.174554 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.174604 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.176605 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.177767 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.177780 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.177869 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.178203 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.181691 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.181720 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.181735 4937 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.181801 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:28.68178124 +0000 UTC m=+99.695173140 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.181658 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.182111 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.182214 4937 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.182416 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:28.682368695 +0000 UTC m=+99.695760705 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.184025 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.184629 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.185634 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.185911 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.187370 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.193942 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.193982 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.193994 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.194011 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.194023 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:28Z","lastTransitionTime":"2026-02-25T15:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.195605 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.205321 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.211567 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.217056 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.262684 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.262785 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.262824 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.262839 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.262852 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.262851 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.262866 4937 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.262932 4937 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.262947 4937 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.262960 4937 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.262974 4937 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.262988 4937 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263000 4937 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263006 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263012 4937 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263048 4937 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263059 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263071 4937 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263085 4937 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263095 4937 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263110 4937 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263122 4937 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263133 4937 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263144 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263155 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263167 4937 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263178 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263189 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263202 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263213 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263224 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263236 4937 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263247 4937 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263259 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263270 4937 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263282 4937 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263293 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263305 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263315 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263326 4937 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263339 4937 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263350 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263362 4937 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263375 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263387 4937 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263398 4937 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263411 4937 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263423 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263435 4937 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263449 4937 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263460 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263472 4937 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263526 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263543 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.263559 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.296206 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.296252 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.296264 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.296281 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.296297 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:28Z","lastTransitionTime":"2026-02-25T15:47:28Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.354088 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.360211 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.369204 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.398385 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.398464 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.398532 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.398578 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.398600 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:28Z","lastTransitionTime":"2026-02-25T15:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.501864 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.502290 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.502310 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.502346 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.502360 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:28Z","lastTransitionTime":"2026-02-25T15:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.604450 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.604517 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.604536 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.604558 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.604573 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:28Z","lastTransitionTime":"2026-02-25T15:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.666257 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.666340 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.666361 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.666463 4937 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.666526 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:29.666513338 +0000 UTC m=+100.679905228 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.666909 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:47:29.666900508 +0000 UTC m=+100.680292398 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.667009 4937 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.667119 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:29.667092773 +0000 UTC m=+100.680484703 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.707114 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.707165 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.707182 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.707205 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.707223 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:28Z","lastTransitionTime":"2026-02-25T15:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.767270 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.767370 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.767565 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.767594 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.767612 4937 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.767677 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:29.767656885 +0000 UTC m=+100.781048815 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.768062 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.768110 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.768135 4937 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:28 crc kubenswrapper[4937]: E0225 15:47:28.768220 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:29.768199919 +0000 UTC m=+100.781591819 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.809955 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.810005 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.810022 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.810045 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.810063 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:28Z","lastTransitionTime":"2026-02-25T15:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.813844 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ab0a6d5eb2594463b7c0371e3d4598d7f6772e807a3f6306bb5419e1b90fa296"} Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.815276 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c007b8f31436c993fdd67c68ca46c98bf13dd5eeba7a81f1d8bee4e8395370fb"} Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.816467 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ac3e6ddabb2b127d5fd1f78222ded48eb975b21773cb6511c74604d6bffe1917"} Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.912894 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.912926 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.912934 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.912949 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:28 crc kubenswrapper[4937]: I0225 15:47:28.912963 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:28Z","lastTransitionTime":"2026-02-25T15:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.016226 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.016271 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.016287 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.016310 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.016327 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:29Z","lastTransitionTime":"2026-02-25T15:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.119755 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.119842 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.119866 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.119891 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.119909 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:29Z","lastTransitionTime":"2026-02-25T15:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.223116 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.223183 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.223206 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.223235 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.223256 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:29Z","lastTransitionTime":"2026-02-25T15:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.325721 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.325950 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.326010 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.326087 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.326183 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:29Z","lastTransitionTime":"2026-02-25T15:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.367602 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.367731 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.378321 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.379187 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.380418 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.381293 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.382420 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.383012 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.383965 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.385405 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.386063 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.387205 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.387855 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.389334 4937 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.389862 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.390377 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.391401 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.391985 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.393094 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.393562 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.394246 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.395459 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.395954 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.397090 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.397543 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.398712 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.399133 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.399757 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.400863 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.401371 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.402576 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.403077 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.404248 4937 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.404368 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.407391 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.408607 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.409936 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.411572 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.412245 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.413256 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.414169 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.416966 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.418316 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.420717 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.422826 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.424515 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.426293 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.427534 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.429339 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.429405 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.429423 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.429450 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.429461 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.429469 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:29Z","lastTransitionTime":"2026-02-25T15:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.431355 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.433179 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.434144 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.435109 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.437109 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.438307 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.440550 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.532284 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.532364 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.532386 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.532416 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.532438 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:29Z","lastTransitionTime":"2026-02-25T15:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.636585 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.636653 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.636671 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.636696 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.636712 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:29Z","lastTransitionTime":"2026-02-25T15:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.678145 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.678314 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.678384 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.678514 4937 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.678592 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:31.678569666 +0000 UTC m=+102.691961586 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.678656 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:47:31.678611667 +0000 UTC m=+102.692003597 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.678830 4937 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.678935 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:31.678909694 +0000 UTC m=+102.692301644 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.740406 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.740552 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.740585 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.740612 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.740631 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:29Z","lastTransitionTime":"2026-02-25T15:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.779122 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.779228 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.779446 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.779475 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.779531 4937 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.779545 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.779613 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.779634 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:31.77960172 +0000 UTC m=+102.792993650 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.779640 4937 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:29 crc kubenswrapper[4937]: E0225 15:47:29.779769 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:31.779731953 +0000 UTC m=+102.793123993 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.825215 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.825290 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.828732 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.841367 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.843329 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.843394 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.843411 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.843435 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.843454 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:29Z","lastTransitionTime":"2026-02-25T15:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.855785 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.867682 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.883893 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.896023 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.909702 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.925022 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.941898 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.946198 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.946254 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.946275 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.946306 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.946333 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:29Z","lastTransitionTime":"2026-02-25T15:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.953865 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.965371 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.976829 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b1
54edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:29 crc kubenswrapper[4937]: I0225 15:47:29.988478 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.049434 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.049477 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.049507 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.049526 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.049539 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.152695 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.152756 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.152821 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.152846 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.152862 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.198449 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.198536 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.198560 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.198591 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.198608 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:30 crc kubenswrapper[4937]: E0225 15:47:30.213775 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.219274 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.219329 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.219347 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.219371 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.219388 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:30 crc kubenswrapper[4937]: E0225 15:47:30.235277 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.239629 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.239680 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.239697 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.239718 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.239735 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:30 crc kubenswrapper[4937]: E0225 15:47:30.254555 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.258501 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.258529 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.258537 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.258567 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.258576 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:30 crc kubenswrapper[4937]: E0225 15:47:30.271777 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.276161 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.276198 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.276209 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.276227 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.276239 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:30 crc kubenswrapper[4937]: E0225 15:47:30.291894 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 15:47:30 crc kubenswrapper[4937]: E0225 15:47:30.292196 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.294175 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.294239 4937 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.294257 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.294280 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.294298 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.367322 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.367421 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:30 crc kubenswrapper[4937]: E0225 15:47:30.367551 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:30 crc kubenswrapper[4937]: E0225 15:47:30.367693 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.397216 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.397260 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.397272 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.397288 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.397302 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.500443 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.500537 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.500562 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.500597 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.500620 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.603429 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.603510 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.603529 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.603552 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.603568 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.705905 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.705996 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.706046 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.706073 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.706090 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.809884 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.809945 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.809962 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.809987 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.810004 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.911977 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.912024 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.912037 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.912054 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:30 crc kubenswrapper[4937]: I0225 15:47:30.912065 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:30Z","lastTransitionTime":"2026-02-25T15:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.015171 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.015358 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.015384 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.015408 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.015425 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:31Z","lastTransitionTime":"2026-02-25T15:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.118106 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.118136 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.118145 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.118159 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.118168 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:31Z","lastTransitionTime":"2026-02-25T15:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.221222 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.221276 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.221295 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.221321 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.221339 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:31Z","lastTransitionTime":"2026-02-25T15:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.323873 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.323930 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.323948 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.323971 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.323988 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:31Z","lastTransitionTime":"2026-02-25T15:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.366664 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.366862 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.388578 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.407806 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.427758 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.428338 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.428355 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.428379 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.428395 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:31Z","lastTransitionTime":"2026-02-25T15:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.429729 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.447896 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.463417 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.475349 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.531068 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.531098 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.531105 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.531120 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.531129 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:31Z","lastTransitionTime":"2026-02-25T15:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.633458 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.633536 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.633549 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.633567 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.633580 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:31Z","lastTransitionTime":"2026-02-25T15:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.703435 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.703619 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:47:35.703598983 +0000 UTC m=+106.716990883 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.703674 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.703711 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.703784 4937 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.703800 4937 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.703850 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:35.703839449 +0000 UTC m=+106.717231349 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.703864 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:35.7038578 +0000 UTC m=+106.717249700 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.736677 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.736755 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.736777 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.736801 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.736819 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:31Z","lastTransitionTime":"2026-02-25T15:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.804323 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.804383 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.804512 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.804535 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.804545 4937 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.804587 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:35.804575776 +0000 UTC m=+106.817967666 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.804512 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.804637 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.804649 4937 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:31 crc kubenswrapper[4937]: E0225 15:47:31.804682 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:35.804672119 +0000 UTC m=+106.818064009 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.835611 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc"} Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.839792 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.839867 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.839905 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.839937 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.839963 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:31Z","lastTransitionTime":"2026-02-25T15:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.942793 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.942834 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.942846 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.942863 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:31 crc kubenswrapper[4937]: I0225 15:47:31.942874 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:31Z","lastTransitionTime":"2026-02-25T15:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.045010 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.045053 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.045066 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.045082 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.045095 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:32Z","lastTransitionTime":"2026-02-25T15:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.147774 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.147815 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.147826 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.147841 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.147852 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:32Z","lastTransitionTime":"2026-02-25T15:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.250775 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.251127 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.251140 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.251158 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.251171 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:32Z","lastTransitionTime":"2026-02-25T15:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.353899 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.353938 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.353951 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.353973 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.353985 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:32Z","lastTransitionTime":"2026-02-25T15:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.367159 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.367178 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:32 crc kubenswrapper[4937]: E0225 15:47:32.367288 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:32 crc kubenswrapper[4937]: E0225 15:47:32.367391 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.455753 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.455795 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.455809 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.455829 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.455840 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:32Z","lastTransitionTime":"2026-02-25T15:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.559262 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.559314 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.559327 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.559345 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.559360 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:32Z","lastTransitionTime":"2026-02-25T15:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.674875 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.674931 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.674945 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.674977 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.674989 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:32Z","lastTransitionTime":"2026-02-25T15:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.778415 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.778512 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.778539 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.778567 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.778629 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:32Z","lastTransitionTime":"2026-02-25T15:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.859942 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.879725 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.880908 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.880989 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.881002 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.881020 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.881031 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:32Z","lastTransitionTime":"2026-02-25T15:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.901964 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.921304 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.941334 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.960990 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.983442 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.983531 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.983552 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.983580 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:32 crc kubenswrapper[4937]: I0225 15:47:32.983597 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:32Z","lastTransitionTime":"2026-02-25T15:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.086744 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.086842 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.086862 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.086886 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.086903 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:33Z","lastTransitionTime":"2026-02-25T15:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
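The status-patch failures above all break at the same point: the kubelet cannot call the pod.network-node-identity.openshift.io webhook on https://127.0.0.1:9743 because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2026-02-25. Below is a minimal Go sketch of the same x509 validity check, not part of any cluster tooling; it assumes the serving certificate is a tls.crt file under the /etc/webhook-cert/ mount shown in the network-node-identity pod spec (the exact filename is an assumption).

// certcheck.go - minimal sketch mirroring the validity test behind
// "x509: certificate has expired or is not yet valid" in the log above.
// The path is an assumption; the log only shows the mount "/etc/webhook-cert/".
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	path := "/etc/webhook-cert/tls.crt" // assumed filename under the mounted secret
	data, err := os.ReadFile(path)
	if err != nil {
		fmt.Fprintln(os.Stderr, "read cert:", err)
		os.Exit(1)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, "parse cert:", err)
		os.Exit(1)
	}
	now := time.Now()
	fmt.Printf("NotBefore: %s\nNotAfter:  %s\nNow:       %s\n",
		cert.NotBefore.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339), now.Format(time.RFC3339))
	if now.After(cert.NotAfter) || now.Before(cert.NotBefore) {
		// This is the condition reported as "certificate has expired or is not yet valid".
		fmt.Println("certificate is outside its validity window")
		os.Exit(2)
	}
	fmt.Println("certificate is currently valid")
}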
Has your network provider started?"} Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.190228 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.190285 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.190306 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.190329 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.190346 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:33Z","lastTransitionTime":"2026-02-25T15:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.292997 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.293054 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.293070 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.293095 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.293111 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:33Z","lastTransitionTime":"2026-02-25T15:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.367360 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:33 crc kubenswrapper[4937]: E0225 15:47:33.367515 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.395304 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.395360 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.395378 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.395400 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.395417 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:33Z","lastTransitionTime":"2026-02-25T15:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.498207 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.498269 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.498290 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.498318 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.498339 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:33Z","lastTransitionTime":"2026-02-25T15:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.601513 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.601567 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.601588 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.601612 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.601628 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:33Z","lastTransitionTime":"2026-02-25T15:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.707475 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.707645 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.707672 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.707704 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.707726 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:33Z","lastTransitionTime":"2026-02-25T15:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.810434 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.810535 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.810558 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.810589 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.810616 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:33Z","lastTransitionTime":"2026-02-25T15:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.913687 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.913729 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.913741 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.913760 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:33 crc kubenswrapper[4937]: I0225 15:47:33.913771 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:33Z","lastTransitionTime":"2026-02-25T15:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.017143 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.017205 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.017222 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.017245 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.017263 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:34Z","lastTransitionTime":"2026-02-25T15:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.120217 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.120870 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.120954 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.120993 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.121015 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:34Z","lastTransitionTime":"2026-02-25T15:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
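The recurring NodeNotReady conditions all carry the same root cause: /etc/kubernetes/cni/net.d/ contains no CNI network configuration, so the container runtime reports NetworkReady=false and the pending sandboxes (network-check-source, network-check-target, networking-console-plugin) cannot be created. The following is a rough, standalone Go approximation of that readiness test; it only looks for .conf/.conflist/.json files in the directory named in the log and is a simplification for illustration, not the kubelet/libcni implementation.

// cnicheck.go - simplified sketch of the check behind
// "no CNI configuration file in /etc/kubernetes/cni/net.d/".
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // the conf directory named in the log messages above
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Fprintln(os.Stderr, "read conf dir:", err)
		os.Exit(1)
	}
	var found []string
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			found = append(found, e.Name())
		}
	}
	if len(found) == 0 {
		fmt.Println("no CNI configuration files found; the network plugin has not written its config yet")
		os.Exit(2)
	}
	fmt.Println("CNI configurations present:", found)
}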
Has your network provider started?"} Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.223714 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.223776 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.223792 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.223815 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.223830 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:34Z","lastTransitionTime":"2026-02-25T15:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.327077 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.327139 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.327157 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.327183 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.327200 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:34Z","lastTransitionTime":"2026-02-25T15:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.367064 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.367143 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:34 crc kubenswrapper[4937]: E0225 15:47:34.367385 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:34 crc kubenswrapper[4937]: E0225 15:47:34.367571 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.381747 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.382948 4937 scope.go:117] "RemoveContainer" containerID="74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527" Feb 25 15:47:34 crc kubenswrapper[4937]: E0225 15:47:34.383113 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.429399 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.429465 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.429520 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.429559 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.429619 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:34Z","lastTransitionTime":"2026-02-25T15:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.531986 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.532031 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.532043 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.532061 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.532074 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:34Z","lastTransitionTime":"2026-02-25T15:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.634589 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.634658 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.634682 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.634709 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.634758 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:34Z","lastTransitionTime":"2026-02-25T15:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.736977 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.737021 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.737033 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.737053 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.737065 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:34Z","lastTransitionTime":"2026-02-25T15:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.839475 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.839594 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.839618 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.839649 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.839671 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:34Z","lastTransitionTime":"2026-02-25T15:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.842837 4937 scope.go:117] "RemoveContainer" containerID="74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527" Feb 25 15:47:34 crc kubenswrapper[4937]: E0225 15:47:34.842992 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.942015 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.942069 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.942085 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.942110 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:34 crc kubenswrapper[4937]: I0225 15:47:34.942128 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:34Z","lastTransitionTime":"2026-02-25T15:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.044977 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.045045 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.045072 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.045095 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.045112 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:35Z","lastTransitionTime":"2026-02-25T15:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.147377 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.147429 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.147445 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.147467 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.147518 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:35Z","lastTransitionTime":"2026-02-25T15:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.249326 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.249363 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.249374 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.249389 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.249400 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:35Z","lastTransitionTime":"2026-02-25T15:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.352221 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.352300 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.352334 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.352365 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.352388 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:35Z","lastTransitionTime":"2026-02-25T15:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.366631 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.366905 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.455407 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.455470 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.455514 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.455539 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.455556 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:35Z","lastTransitionTime":"2026-02-25T15:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.559296 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.559355 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.559374 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.559398 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.559416 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:35Z","lastTransitionTime":"2026-02-25T15:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.662840 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.662899 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.662922 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.662948 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.662965 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:35Z","lastTransitionTime":"2026-02-25T15:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.742976 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.743113 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.743180 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.743228 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:47:43.743192816 +0000 UTC m=+114.756584766 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.743318 4937 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.743343 4937 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.743408 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:43.74338774 +0000 UTC m=+114.756779660 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.743433 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:43.743421511 +0000 UTC m=+114.756813441 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.765382 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.765448 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.765471 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.765527 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.765550 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:35Z","lastTransitionTime":"2026-02-25T15:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.843950 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.844088 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.844172 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.844209 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.844233 4937 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.844341 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:43.844311762 +0000 UTC m=+114.857703692 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.844399 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.844450 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.844474 4937 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:35 crc kubenswrapper[4937]: E0225 15:47:35.844599 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:43.844574709 +0000 UTC m=+114.857966629 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.868068 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.868170 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.868190 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.868306 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.868330 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:35Z","lastTransitionTime":"2026-02-25T15:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
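The reconciler entries above show every volume operation for these pods being placed on an 8-second back-off: the CSI unmount fails because the kubevirt.io.hostpath-provisioner driver is not yet registered, and the configmap, secret, and projected (kube-api-access) mounts fail because their API objects are reported as not registered. A minimal sketch for summarizing those back-offs from a saved journal excerpt, assuming the exact nestedpendingoperations phrasing shown here and that the text is piped on stdin:

import re
import sys

# Pattern assumes the exact nestedpendingoperations.go phrasing in the
# journal lines above (volumeName / podName / nodeName inside braces,
# followed by "No retries permitted until <timestamp> UTC").
PATTERN = re.compile(
    r'Operation for "\{volumeName:(?P<volume>\S+) '
    r'podName:(?P<pod>\S*) nodeName:[^}]*\}" failed\. '
    r'No retries permitted until (?P<until>[\d:.+\- ]+UTC)'
)

def backoffs(journal_text):
    """Yield (volume, pod UID, retry-at) for each back-off entry found."""
    for m in PATTERN.finditer(journal_text):
        yield m.group("volume"), m.group("pod"), m.group("until")

if __name__ == "__main__":
    # Usage (illustrative): journalctl -u kubelet | python3 backoff_summary.py
    for volume, pod, until in backoffs(sys.stdin.read()):
        print(f"retry at {until}  pod={pod or '<none>'}  volume={volume}")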
Has your network provider started?"} Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.970697 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.970764 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.970791 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.970821 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:35 crc kubenswrapper[4937]: I0225 15:47:35.970845 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:35Z","lastTransitionTime":"2026-02-25T15:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.073974 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.074042 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.074067 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.074096 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.074120 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:36Z","lastTransitionTime":"2026-02-25T15:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.177448 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.177557 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.177578 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.177601 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.177618 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:36Z","lastTransitionTime":"2026-02-25T15:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.281537 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.281603 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.281621 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.281648 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.281669 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:36Z","lastTransitionTime":"2026-02-25T15:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.367016 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.367015 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:36 crc kubenswrapper[4937]: E0225 15:47:36.367184 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:36 crc kubenswrapper[4937]: E0225 15:47:36.367310 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.384758 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.384808 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.384824 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.384849 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.384874 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:36Z","lastTransitionTime":"2026-02-25T15:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.487399 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.487431 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.487458 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.487471 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.487479 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:36Z","lastTransitionTime":"2026-02-25T15:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.590520 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.590578 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.590595 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.590618 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.590636 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:36Z","lastTransitionTime":"2026-02-25T15:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.693356 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.693411 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.693429 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.693452 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.693470 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:36Z","lastTransitionTime":"2026-02-25T15:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.796227 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.796269 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.796280 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.796297 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.796308 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:36Z","lastTransitionTime":"2026-02-25T15:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.898252 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.898336 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.898362 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.898438 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:36 crc kubenswrapper[4937]: I0225 15:47:36.898553 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:36Z","lastTransitionTime":"2026-02-25T15:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.001518 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.001597 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.001635 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.001664 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.001686 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:37Z","lastTransitionTime":"2026-02-25T15:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.104195 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.104259 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.104278 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.104300 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.104319 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:37Z","lastTransitionTime":"2026-02-25T15:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.206728 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.206799 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.206817 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.206840 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.206857 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:37Z","lastTransitionTime":"2026-02-25T15:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.309732 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.309841 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.309864 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.309888 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.309904 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:37Z","lastTransitionTime":"2026-02-25T15:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.367213 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:37 crc kubenswrapper[4937]: E0225 15:47:37.367406 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.412903 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.412948 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.412958 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.412973 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.412984 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:37Z","lastTransitionTime":"2026-02-25T15:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.516565 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.516629 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.516643 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.516665 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.516679 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:37Z","lastTransitionTime":"2026-02-25T15:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.619447 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.619538 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.619558 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.619582 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.619596 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:37Z","lastTransitionTime":"2026-02-25T15:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.722638 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.722685 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.722695 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.722713 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.722723 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:37Z","lastTransitionTime":"2026-02-25T15:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.825563 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.825621 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.825638 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.825661 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.825677 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:37Z","lastTransitionTime":"2026-02-25T15:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.929082 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.929167 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.929191 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.929226 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:37 crc kubenswrapper[4937]: I0225 15:47:37.929247 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:37Z","lastTransitionTime":"2026-02-25T15:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.032425 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.032528 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.032554 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.032584 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.032606 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:38Z","lastTransitionTime":"2026-02-25T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.135823 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.135893 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.135912 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.135938 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.135957 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:38Z","lastTransitionTime":"2026-02-25T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.238564 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.238623 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.238640 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.238662 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.238684 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:38Z","lastTransitionTime":"2026-02-25T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.342730 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.342806 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.342828 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.342853 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.342871 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:38Z","lastTransitionTime":"2026-02-25T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.366995 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.367241 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:38 crc kubenswrapper[4937]: E0225 15:47:38.367298 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:38 crc kubenswrapper[4937]: E0225 15:47:38.367358 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.446262 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.446326 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.446348 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.446377 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.446402 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:38Z","lastTransitionTime":"2026-02-25T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.549148 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.549232 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.549252 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.549275 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.549293 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:38Z","lastTransitionTime":"2026-02-25T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.652265 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.652337 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.652358 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.652385 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.652403 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:38Z","lastTransitionTime":"2026-02-25T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.755941 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.756004 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.756024 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.756049 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.756066 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:38Z","lastTransitionTime":"2026-02-25T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.859259 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.859325 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.859337 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.859354 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.859366 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:38Z","lastTransitionTime":"2026-02-25T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.962703 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.962773 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.962793 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.962816 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:38 crc kubenswrapper[4937]: I0225 15:47:38.962834 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:38Z","lastTransitionTime":"2026-02-25T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.065912 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.065963 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.065980 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.066000 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.066015 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:39Z","lastTransitionTime":"2026-02-25T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.168198 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.168281 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.168306 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.168335 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.168352 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:39Z","lastTransitionTime":"2026-02-25T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.271719 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.271782 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.271800 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.271827 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.271844 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:39Z","lastTransitionTime":"2026-02-25T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.366974 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:39 crc kubenswrapper[4937]: E0225 15:47:39.367513 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.375676 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.375725 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.375747 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.375776 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.375797 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:39Z","lastTransitionTime":"2026-02-25T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.384769 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.479518 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.479569 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.479620 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.479646 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.479664 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:39Z","lastTransitionTime":"2026-02-25T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.583673 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.584178 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.584209 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.584244 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.584272 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:39Z","lastTransitionTime":"2026-02-25T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.687189 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.687251 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.687293 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.687319 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.687336 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:39Z","lastTransitionTime":"2026-02-25T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.790309 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.790358 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.790370 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.790387 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.790398 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:39Z","lastTransitionTime":"2026-02-25T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.894395 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.894433 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.894442 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.894457 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.894467 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:39Z","lastTransitionTime":"2026-02-25T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.997096 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.997156 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.997173 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.997200 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:39 crc kubenswrapper[4937]: I0225 15:47:39.997217 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:39Z","lastTransitionTime":"2026-02-25T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.099342 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.099383 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.099397 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.099414 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.099426 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:40Z","lastTransitionTime":"2026-02-25T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.202471 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.202579 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.202593 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.202658 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.202675 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:40Z","lastTransitionTime":"2026-02-25T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.307122 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.307177 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.307191 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.307209 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.307226 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:40Z","lastTransitionTime":"2026-02-25T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.367133 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.367154 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:40 crc kubenswrapper[4937]: E0225 15:47:40.367368 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:40 crc kubenswrapper[4937]: E0225 15:47:40.367587 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.380265 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.380320 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.380338 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.380362 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.380379 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:40Z","lastTransitionTime":"2026-02-25T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:40 crc kubenswrapper[4937]: E0225 15:47:40.400475 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.405229 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.405269 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
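The patch failure above shows why the node status never converges: the API server's call to the node.network-node-identity.openshift.io webhook (https://127.0.0.1:9743/node) fails because that endpoint presents a serving certificate that expired on 2025-08-24T17:21:41Z while the node clock reads 2026-02-25, so every status update is rejected with an x509 error. A minimal sketch for inspecting that certificate from the node itself, assuming Go is available and deliberately skipping TLS verification only so the expired certificate's dates can be read:

```go
// certprobe.go - minimal sketch: read the serving certificate of the webhook
// endpoint named in the patch failure (127.0.0.1:9743) and compare its
// validity window with the current clock. Run on the node, since the
// endpoint listens on loopback; skipping verification is only to allow the
// handshake to complete despite the expired certificate.
package main

import (
	"crypto/tls"
	"fmt"
	"os"
	"time"
)

func main() {
	addr := "127.0.0.1:9743" // host:port from the webhook URL in the error
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Printf("cannot connect to %s: %v\n", addr, err)
		os.Exit(1)
	}
	defer conn.Close()

	state := conn.ConnectionState()
	if len(state.PeerCertificates) == 0 {
		fmt.Println("no peer certificate presented")
		os.Exit(1)
	}
	cert := state.PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
	if now.After(cert.NotAfter) {
		fmt.Printf("certificate expired %s ago; node status patches will keep failing until it is rotated\n",
			now.Sub(cert.NotAfter).Round(time.Minute))
	}
}
```

Until that certificate is rotated (or the node clock corrected), the kubelet keeps retrying the same status patch, which is the repetition visible in the surrounding entries.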
event="NodeHasNoDiskPressure" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.405281 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.405299 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.405311 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:40Z","lastTransitionTime":"2026-02-25T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:40 crc kubenswrapper[4937]: E0225 15:47:40.425552 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.430237 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.430296 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.430313 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.430337 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.430356 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:40Z","lastTransitionTime":"2026-02-25T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:40 crc kubenswrapper[4937]: E0225 15:47:40.453507 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.460657 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.460725 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.460755 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.460779 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.460804 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:40Z","lastTransitionTime":"2026-02-25T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:40 crc kubenswrapper[4937]: E0225 15:47:40.584712 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.589198 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.589243 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.589253 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.589270 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.589282 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:40Z","lastTransitionTime":"2026-02-25T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:40 crc kubenswrapper[4937]: E0225 15:47:40.606004 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:40 crc kubenswrapper[4937]: E0225 15:47:40.606143 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.607763 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.607791 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.607802 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.607819 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.607829 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:40Z","lastTransitionTime":"2026-02-25T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.711318 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.711370 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.711382 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.711398 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.711410 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:40Z","lastTransitionTime":"2026-02-25T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.751620 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-vrqcw"] Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.752083 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-vrqcw" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.755162 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.757992 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.758216 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.768669 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4wr2\" (UniqueName: \"kubernetes.io/projected/b0f809a1-5ded-4908-ab90-a91c806e2302-kube-api-access-p4wr2\") pod \"node-resolver-vrqcw\" (UID: \"b0f809a1-5ded-4908-ab90-a91c806e2302\") " pod="openshift-dns/node-resolver-vrqcw" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.768789 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0f809a1-5ded-4908-ab90-a91c806e2302-hosts-file\") pod \"node-resolver-vrqcw\" (UID: \"b0f809a1-5ded-4908-ab90-a91c806e2302\") " pod="openshift-dns/node-resolver-vrqcw" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.775774 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.788619 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.809949 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.814218 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.814308 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.814329 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.814353 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.814369 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:40Z","lastTransitionTime":"2026-02-25T15:47:40Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.826294 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.846910 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.865076 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.870971 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0f809a1-5ded-4908-ab90-a91c806e2302-hosts-file\") pod \"node-resolver-vrqcw\" (UID: \"b0f809a1-5ded-4908-ab90-a91c806e2302\") " pod="openshift-dns/node-resolver-vrqcw" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.871026 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4wr2\" (UniqueName: \"kubernetes.io/projected/b0f809a1-5ded-4908-ab90-a91c806e2302-kube-api-access-p4wr2\") pod \"node-resolver-vrqcw\" (UID: \"b0f809a1-5ded-4908-ab90-a91c806e2302\") " pod="openshift-dns/node-resolver-vrqcw" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.871143 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0f809a1-5ded-4908-ab90-a91c806e2302-hosts-file\") pod \"node-resolver-vrqcw\" (UID: \"b0f809a1-5ded-4908-ab90-a91c806e2302\") " pod="openshift-dns/node-resolver-vrqcw" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.884625 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.905989 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4wr2\" (UniqueName: \"kubernetes.io/projected/b0f809a1-5ded-4908-ab90-a91c806e2302-kube-api-access-p4wr2\") pod \"node-resolver-vrqcw\" (UID: \"b0f809a1-5ded-4908-ab90-a91c806e2302\") " pod="openshift-dns/node-resolver-vrqcw" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.917829 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.918098 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.918188 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.918279 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.918363 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:40Z","lastTransitionTime":"2026-02-25T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.920032 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b96322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8
dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:40 crc kubenswrapper[4937]: I0225 15:47:40.943580 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:40Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.021611 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.021665 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.021681 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.021706 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.021723 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:41Z","lastTransitionTime":"2026-02-25T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.075638 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-vrqcw" Feb 25 15:47:41 crc kubenswrapper[4937]: W0225 15:47:41.101139 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0f809a1_5ded_4908_ab90_a91c806e2302.slice/crio-95c45a6a8d547555248e8ffa6f875583145453317bd6feabe609412fdb1fc56e WatchSource:0}: Error finding container 95c45a6a8d547555248e8ffa6f875583145453317bd6feabe609412fdb1fc56e: Status 404 returned error can't find the container with id 95c45a6a8d547555248e8ffa6f875583145453317bd6feabe609412fdb1fc56e Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.124662 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.124717 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.124735 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.124759 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.124782 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:41Z","lastTransitionTime":"2026-02-25T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.135020 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-dlbgx"] Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.135447 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.142777 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.143181 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.143432 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.143703 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.143892 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.145136 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-crvn5"] Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.146358 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-2r4xd"] Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.146624 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.146896 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.151898 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.153024 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.153316 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.153419 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.153812 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.153967 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.154279 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.159788 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174214 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-var-lib-cni-bin\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174254 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3aad09c8-a744-4e42-a270-8cfee256b07f-os-release\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174278 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8rbb\" (UniqueName: \"kubernetes.io/projected/8f826096-fb93-42fe-a779-9afe1d36f2d4-kube-api-access-p8rbb\") pod \"machine-config-daemon-2r4xd\" (UID: \"8f826096-fb93-42fe-a779-9afe1d36f2d4\") " pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174302 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-system-cni-dir\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174317 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-run-netns\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174345 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-run-k8s-cni-cncf-io\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174360 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-run-multus-certs\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174378 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3aad09c8-a744-4e42-a270-8cfee256b07f-cnibin\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174397 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f826096-fb93-42fe-a779-9afe1d36f2d4-mcd-auth-proxy-config\") pod \"machine-config-daemon-2r4xd\" (UID: \"8f826096-fb93-42fe-a779-9afe1d36f2d4\") " pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174417 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-var-lib-kubelet\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174438 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8f826096-fb93-42fe-a779-9afe1d36f2d4-rootfs\") pod \"machine-config-daemon-2r4xd\" (UID: \"8f826096-fb93-42fe-a779-9afe1d36f2d4\") " pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174457 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vxd2\" (UniqueName: \"kubernetes.io/projected/f193b13f-50ab-454a-9230-a96922b25186-kube-api-access-5vxd2\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174475 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-multus-conf-dir\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174513 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3aad09c8-a744-4e42-a270-8cfee256b07f-system-cni-dir\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174537 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmvp2\" (UniqueName: \"kubernetes.io/projected/3aad09c8-a744-4e42-a270-8cfee256b07f-kube-api-access-zmvp2\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" 
Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174564 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-cnibin\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174585 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3aad09c8-a744-4e42-a270-8cfee256b07f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174611 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-multus-socket-dir-parent\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174633 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f826096-fb93-42fe-a779-9afe1d36f2d4-proxy-tls\") pod \"machine-config-daemon-2r4xd\" (UID: \"8f826096-fb93-42fe-a779-9afe1d36f2d4\") " pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174649 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-etc-kubernetes\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174677 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f193b13f-50ab-454a-9230-a96922b25186-multus-daemon-config\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174691 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-multus-cni-dir\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174705 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f193b13f-50ab-454a-9230-a96922b25186-cni-binary-copy\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174721 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-var-lib-cni-multus\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " 
pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174736 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3aad09c8-a744-4e42-a270-8cfee256b07f-cni-binary-copy\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174751 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3aad09c8-a744-4e42-a270-8cfee256b07f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174766 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-os-release\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.174781 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-hostroot\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.175688 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.192903 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.214543 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.227563 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.227600 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.227611 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.227627 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.227638 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:41Z","lastTransitionTime":"2026-02-25T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.231539 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.244562 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.256075 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.270860 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.275876 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-multus-cni-dir\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.275908 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f193b13f-50ab-454a-9230-a96922b25186-cni-binary-copy\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.275928 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-var-lib-cni-multus\") pod 
\"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.275947 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3aad09c8-a744-4e42-a270-8cfee256b07f-cni-binary-copy\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.275967 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3aad09c8-a744-4e42-a270-8cfee256b07f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.275987 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-os-release\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276006 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-hostroot\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276026 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3aad09c8-a744-4e42-a270-8cfee256b07f-os-release\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276047 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8rbb\" (UniqueName: \"kubernetes.io/projected/8f826096-fb93-42fe-a779-9afe1d36f2d4-kube-api-access-p8rbb\") pod \"machine-config-daemon-2r4xd\" (UID: \"8f826096-fb93-42fe-a779-9afe1d36f2d4\") " pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276067 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-system-cni-dir\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276087 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-run-netns\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276107 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-var-lib-cni-bin\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " 
pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276134 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-run-k8s-cni-cncf-io\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276155 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-run-multus-certs\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276183 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3aad09c8-a744-4e42-a270-8cfee256b07f-cnibin\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276202 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f826096-fb93-42fe-a779-9afe1d36f2d4-mcd-auth-proxy-config\") pod \"machine-config-daemon-2r4xd\" (UID: \"8f826096-fb93-42fe-a779-9afe1d36f2d4\") " pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276223 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-var-lib-kubelet\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276241 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8f826096-fb93-42fe-a779-9afe1d36f2d4-rootfs\") pod \"machine-config-daemon-2r4xd\" (UID: \"8f826096-fb93-42fe-a779-9afe1d36f2d4\") " pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276261 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vxd2\" (UniqueName: \"kubernetes.io/projected/f193b13f-50ab-454a-9230-a96922b25186-kube-api-access-5vxd2\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276282 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-multus-conf-dir\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276301 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3aad09c8-a744-4e42-a270-8cfee256b07f-system-cni-dir\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 
15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276322 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmvp2\" (UniqueName: \"kubernetes.io/projected/3aad09c8-a744-4e42-a270-8cfee256b07f-kube-api-access-zmvp2\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276342 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-cnibin\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276361 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3aad09c8-a744-4e42-a270-8cfee256b07f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276384 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-multus-socket-dir-parent\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276405 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f826096-fb93-42fe-a779-9afe1d36f2d4-proxy-tls\") pod \"machine-config-daemon-2r4xd\" (UID: \"8f826096-fb93-42fe-a779-9afe1d36f2d4\") " pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276424 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-etc-kubernetes\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276459 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f193b13f-50ab-454a-9230-a96922b25186-multus-daemon-config\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276801 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-hostroot\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276809 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-multus-cni-dir\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.276855 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/3aad09c8-a744-4e42-a270-8cfee256b07f-cnibin\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.277448 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f193b13f-50ab-454a-9230-a96922b25186-cni-binary-copy\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.277506 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f826096-fb93-42fe-a779-9afe1d36f2d4-mcd-auth-proxy-config\") pod \"machine-config-daemon-2r4xd\" (UID: \"8f826096-fb93-42fe-a779-9afe1d36f2d4\") " pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.277555 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-var-lib-kubelet\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.277560 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-var-lib-cni-multus\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.277587 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8f826096-fb93-42fe-a779-9afe1d36f2d4-rootfs\") pod \"machine-config-daemon-2r4xd\" (UID: \"8f826096-fb93-42fe-a779-9afe1d36f2d4\") " pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.277801 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-multus-conf-dir\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.277842 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3aad09c8-a744-4e42-a270-8cfee256b07f-system-cni-dir\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.278173 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-cnibin\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.278175 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3aad09c8-a744-4e42-a270-8cfee256b07f-cni-binary-copy\") pod \"multus-additional-cni-plugins-crvn5\" (UID: 
\"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.278227 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-os-release\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.278283 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3aad09c8-a744-4e42-a270-8cfee256b07f-os-release\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.278316 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-var-lib-cni-bin\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.278345 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-run-netns\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.278329 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-system-cni-dir\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.278668 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3aad09c8-a744-4e42-a270-8cfee256b07f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.278734 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-multus-socket-dir-parent\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.278771 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-etc-kubernetes\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.278808 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-run-k8s-cni-cncf-io\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.278821 4937 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f193b13f-50ab-454a-9230-a96922b25186-host-run-multus-certs\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.279290 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3aad09c8-a744-4e42-a270-8cfee256b07f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.280777 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f193b13f-50ab-454a-9230-a96922b25186-multus-daemon-config\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.285535 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f826096-fb93-42fe-a779-9afe1d36f2d4-proxy-tls\") pod \"machine-config-daemon-2r4xd\" (UID: \"8f826096-fb93-42fe-a779-9afe1d36f2d4\") " pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.285898 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.293104 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8rbb\" (UniqueName: \"kubernetes.io/projected/8f826096-fb93-42fe-a779-9afe1d36f2d4-kube-api-access-p8rbb\") pod \"machine-config-daemon-2r4xd\" (UID: \"8f826096-fb93-42fe-a779-9afe1d36f2d4\") " pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.293970 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vxd2\" (UniqueName: \"kubernetes.io/projected/f193b13f-50ab-454a-9230-a96922b25186-kube-api-access-5vxd2\") pod \"multus-dlbgx\" (UID: \"f193b13f-50ab-454a-9230-a96922b25186\") " pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.300102 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.305649 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmvp2\" (UniqueName: \"kubernetes.io/projected/3aad09c8-a744-4e42-a270-8cfee256b07f-kube-api-access-zmvp2\") pod \"multus-additional-cni-plugins-crvn5\" (UID: \"3aad09c8-a744-4e42-a270-8cfee256b07f\") " pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.318322 
4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.330061 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.330092 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.330151 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.330171 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.330184 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:41Z","lastTransitionTime":"2026-02-25T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.330732 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.343739 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.358645 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.367287 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:41 crc kubenswrapper[4937]: E0225 15:47:41.367378 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.379902 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 
1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.403416 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.415024 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.427182 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.434239 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.434276 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.434284 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.434299 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.434310 4937 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:41Z","lastTransitionTime":"2026-02-25T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.441380 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.454771 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.470940 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.477860 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-dlbgx" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.483913 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: W0225 15:47:41.490699 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf193b13f_50ab_454a_9230_a96922b25186.slice/crio-79df383be113f71c0651139e5ad92cc79f1856f3617370940f1d3a30fe08c4c3 WatchSource:0}: Error finding container 79df383be113f71c0651139e5ad92cc79f1856f3617370940f1d3a30fe08c4c3: Status 404 returned error can't find the container with id 79df383be113f71c0651139e5ad92cc79f1856f3617370940f1d3a30fe08c4c3 Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.492673 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-crvn5" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.493859 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.500119 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.514604 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.517103 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cl2zn"] Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.518037 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.520010 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.520068 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.520372 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.520461 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.520580 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.520682 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.521349 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.539998 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.540026 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.540035 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.540051 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.540079 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:41Z","lastTransitionTime":"2026-02-25T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.542518 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.554608 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.564573 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.575422 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: W0225 15:47:41.577821 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aad09c8_a744_4e42_a270_8cfee256b07f.slice/crio-b4552c7f33cb5719e0a62614e93b695e9426a14c40f59ead1a56c9445f9488db WatchSource:0}: Error finding container b4552c7f33cb5719e0a62614e93b695e9426a14c40f59ead1a56c9445f9488db: Status 404 returned error can't find the container with id b4552c7f33cb5719e0a62614e93b695e9426a14c40f59ead1a56c9445f9488db Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578388 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2px4\" (UniqueName: \"kubernetes.io/projected/89a5d3cb-d884-4e27-90df-972e98830bcb-kube-api-access-r2px4\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578424 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578446 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89a5d3cb-d884-4e27-90df-972e98830bcb-ovn-node-metrics-cert\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578468 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-etc-openvswitch\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578524 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-openvswitch\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578550 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-ovn\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578568 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-cni-netd\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578669 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-kubelet\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578814 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-systemd-units\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578848 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-slash\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578870 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-systemd\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578910 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-cni-bin\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578931 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-ovnkube-script-lib\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578952 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-var-lib-openvswitch\") pod \"ovnkube-node-cl2zn\" 
(UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578974 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-env-overrides\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.578996 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-run-netns\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.579017 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-node-log\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.579037 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-ovnkube-config\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.579060 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-run-ovn-kubernetes\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.579896 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-log-socket\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.589053 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.599814 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.623120 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.638442 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.642623 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.642666 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.642678 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.642696 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.642710 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:41Z","lastTransitionTime":"2026-02-25T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.653106 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.669647 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.681607 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-run-netns\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.681661 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-node-log\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.681704 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-ovnkube-config\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.681739 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-run-ovn-kubernetes\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.681810 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-log-socket\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.681844 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2px4\" (UniqueName: 
\"kubernetes.io/projected/89a5d3cb-d884-4e27-90df-972e98830bcb-kube-api-access-r2px4\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.681881 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.681938 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89a5d3cb-d884-4e27-90df-972e98830bcb-ovn-node-metrics-cert\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.681977 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-etc-openvswitch\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.682069 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-openvswitch\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.682107 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-ovn\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.682139 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-cni-netd\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.682175 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-kubelet\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.682205 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-systemd\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.682263 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-systemd-units\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.682298 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-slash\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.682330 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-cni-bin\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.682361 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-ovnkube-script-lib\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.682394 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-var-lib-openvswitch\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.682426 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-env-overrides\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.682976 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-run-netns\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.683085 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-node-log\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.683167 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.683281 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-ovn\") pod 
\"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.683400 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-cni-netd\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.683521 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-kubelet\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.683053 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-openvswitch\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.683963 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-var-lib-openvswitch\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.684062 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-run-ovn-kubernetes\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.684103 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-log-socket\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.684333 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-env-overrides\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.684445 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-systemd-units\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.684551 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-systemd\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: 
I0225 15:47:41.684596 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-etc-openvswitch\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.684617 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-slash\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.684666 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-ovnkube-config\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.684780 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-ovnkube-script-lib\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.685583 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-cni-bin\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.688992 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.689243 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89a5d3cb-d884-4e27-90df-972e98830bcb-ovn-node-metrics-cert\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.705643 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32
fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.706155 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2px4\" (UniqueName: \"kubernetes.io/projected/89a5d3cb-d884-4e27-90df-972e98830bcb-kube-api-access-r2px4\") pod \"ovnkube-node-cl2zn\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.723124 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.744028 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.746957 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.747010 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.747027 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.747052 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.747072 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:41Z","lastTransitionTime":"2026-02-25T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.763583 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.800479 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.820173 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.833130 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.833125 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.849316 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.849376 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.849392 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.849411 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.849426 4937 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:41Z","lastTransitionTime":"2026-02-25T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.854052 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.876155 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"deeba04fe832fe5ec1ed73afce3c681b739a013d5ffb6b990fd0ac7bf7e48f60"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.878432 4937 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/multus-dlbgx" event={"ID":"f193b13f-50ab-454a-9230-a96922b25186","Type":"ContainerStarted","Data":"79df383be113f71c0651139e5ad92cc79f1856f3617370940f1d3a30fe08c4c3"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.880798 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vrqcw" event={"ID":"b0f809a1-5ded-4908-ab90-a91c806e2302","Type":"ContainerStarted","Data":"95c45a6a8d547555248e8ffa6f875583145453317bd6feabe609412fdb1fc56e"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.882638 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" event={"ID":"3aad09c8-a744-4e42-a270-8cfee256b07f","Type":"ContainerStarted","Data":"b4552c7f33cb5719e0a62614e93b695e9426a14c40f59ead1a56c9445f9488db"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.898919 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b96322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02
-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.923803 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: W0225 15:47:41.930430 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89a5d3cb_d884_4e27_90df_972e98830bcb.slice/crio-68e4cb7e0fbef2b311822beb59b0976faca4be58ed711777bb3daa0e8856b5cd WatchSource:0}: Error finding container 68e4cb7e0fbef2b311822beb59b0976faca4be58ed711777bb3daa0e8856b5cd: Status 404 returned error can't find the container with id 68e4cb7e0fbef2b311822beb59b0976faca4be58ed711777bb3daa0e8856b5cd Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.942422 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.952134 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.952200 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.952214 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.952237 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.952254 4937 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:41Z","lastTransitionTime":"2026-02-25T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:41 crc kubenswrapper[4937]: I0225 15:47:41.961046 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.054919 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.055287 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.055301 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.055320 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.055332 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:42Z","lastTransitionTime":"2026-02-25T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.158564 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.158636 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.158654 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.158682 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.158699 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:42Z","lastTransitionTime":"2026-02-25T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.261717 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.261772 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.261789 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.261810 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.261828 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:42Z","lastTransitionTime":"2026-02-25T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.363740 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.363770 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.363778 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.363792 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.363802 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:42Z","lastTransitionTime":"2026-02-25T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.367210 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:42 crc kubenswrapper[4937]: E0225 15:47:42.367368 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.367210 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:42 crc kubenswrapper[4937]: E0225 15:47:42.367533 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.466753 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.466810 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.466826 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.466846 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.466861 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:42Z","lastTransitionTime":"2026-02-25T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.569061 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.569401 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.569666 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.569858 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.570044 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:42Z","lastTransitionTime":"2026-02-25T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.673273 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.673303 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.673311 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.673324 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.673334 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:42Z","lastTransitionTime":"2026-02-25T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.776379 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.776443 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.776458 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.776504 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.776532 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:42Z","lastTransitionTime":"2026-02-25T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.879558 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.879604 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.879618 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.879636 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.879648 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:42Z","lastTransitionTime":"2026-02-25T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.886358 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlbgx" event={"ID":"f193b13f-50ab-454a-9230-a96922b25186","Type":"ContainerStarted","Data":"55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.888130 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vrqcw" event={"ID":"b0f809a1-5ded-4908-ab90-a91c806e2302","Type":"ContainerStarted","Data":"007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.890355 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" event={"ID":"3aad09c8-a744-4e42-a270-8cfee256b07f","Type":"ContainerStarted","Data":"8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.892522 4937 generic.go:334] "Generic (PLEG): container finished" podID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerID="1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411" exitCode=0 Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.892531 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.892671 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerStarted","Data":"68e4cb7e0fbef2b311822beb59b0976faca4be58ed711777bb3daa0e8856b5cd"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.894309 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.907404 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:42Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.920236 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:42Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.941970 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:42Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.969920 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:42Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.984914 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.984975 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.984992 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.985011 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.985022 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:42Z","lastTransitionTime":"2026-02-25T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:42 crc kubenswrapper[4937]: I0225 15:47:42.987734 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:42Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.004764 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.021532 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.040885 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.060768 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.073770 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.087539 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.087582 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.087595 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.087612 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.087624 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:43Z","lastTransitionTime":"2026-02-25T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.088554 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.102597 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.125381 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":
\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.142385 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.161020 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.175650 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.189642 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.191555 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.191636 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.191654 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.191679 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.191697 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:43Z","lastTransitionTime":"2026-02-25T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.215368 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z 
is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.233536 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.259767 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.283923 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.294528 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.294587 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.294605 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.294631 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 
25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.294657 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:43Z","lastTransitionTime":"2026-02-25T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.301248 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.323652 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.347415 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.366397 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.366949 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.367151 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.397984 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.398060 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.398077 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.398103 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.398121 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:43Z","lastTransitionTime":"2026-02-25T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.403156 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b96322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.501859 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.501926 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.501943 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.501968 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.501988 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:43Z","lastTransitionTime":"2026-02-25T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.605010 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.605086 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.605108 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.605138 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.605159 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:43Z","lastTransitionTime":"2026-02-25T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.707666 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.707728 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.707746 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.707773 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.707791 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:43Z","lastTransitionTime":"2026-02-25T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.806121 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.806213 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.806307 4937 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.806308 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:47:59.806270013 +0000 UTC m=+130.819661933 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.806358 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:59.806343465 +0000 UTC m=+130.819735365 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.806479 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.806667 4937 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.806741 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:59.806727195 +0000 UTC m=+130.820119115 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.810321 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.810374 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.810445 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.810469 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.810507 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:43Z","lastTransitionTime":"2026-02-25T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.901340 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228"} Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.903263 4937 generic.go:334] "Generic (PLEG): container finished" podID="3aad09c8-a744-4e42-a270-8cfee256b07f" containerID="8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc" exitCode=0 Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.903384 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" event={"ID":"3aad09c8-a744-4e42-a270-8cfee256b07f","Type":"ContainerDied","Data":"8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc"} Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.905703 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerStarted","Data":"ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46"} Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.907213 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.907291 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.907457 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.907469 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.907514 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.907525 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.907533 4937 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.907545 4937 
projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.907603 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:59.907582535 +0000 UTC m=+130.920974465 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:43 crc kubenswrapper[4937]: E0225 15:47:43.907629 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:59.907617936 +0000 UTC m=+130.921009856 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.913243 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.913301 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.913318 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.913342 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.913360 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:43Z","lastTransitionTime":"2026-02-25T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.926644 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.942968 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:43 crc kubenswrapper[4937]: I0225 15:47:43.968883 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:43Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.005731 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b96322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646f
b68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.020112 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.020159 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.020170 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.020187 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.020199 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:44Z","lastTransitionTime":"2026-02-25T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.024759 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.040557 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.056686 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.072852 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.092801 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.106978 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.122569 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.122612 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.122625 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.122669 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.122681 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:44Z","lastTransitionTime":"2026-02-25T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.126908 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.146073 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.180840 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z 
is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.195938 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.218894 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-d
ir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.224955 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.224986 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.224998 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.225015 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.225030 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:44Z","lastTransitionTime":"2026-02-25T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.240036 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.272636 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.295152 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.312572 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.327814 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.327862 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.327878 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.327899 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.327912 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:44Z","lastTransitionTime":"2026-02-25T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.333017 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.350653 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.366883 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.367024 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:44 crc kubenswrapper[4937]: E0225 15:47:44.367213 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:44 crc kubenswrapper[4937]: E0225 15:47:44.367389 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.368994 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.383930 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.400276 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.421712 4937 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.431590 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.431671 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.431688 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.431712 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.431729 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:44Z","lastTransitionTime":"2026-02-25T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.451997 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420
253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.534160 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.534191 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.534201 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.534218 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.534230 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:44Z","lastTransitionTime":"2026-02-25T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.637562 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.637670 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.637696 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.637728 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.637751 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:44Z","lastTransitionTime":"2026-02-25T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.741978 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.742035 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.742046 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.742060 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.742070 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:44Z","lastTransitionTime":"2026-02-25T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.844507 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.844548 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.844562 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.844583 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.844597 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:44Z","lastTransitionTime":"2026-02-25T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.920048 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerStarted","Data":"3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.920100 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerStarted","Data":"17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.920121 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerStarted","Data":"cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.922650 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" event={"ID":"3aad09c8-a744-4e42-a270-8cfee256b07f","Type":"ContainerStarted","Data":"6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.947617 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.947651 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.947660 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.947678 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.947690 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:44Z","lastTransitionTime":"2026-02-25T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.948513 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.960842 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.980655 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\
\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\
\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:44 crc kubenswrapper[4937]: I0225 15:47:44.995807 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\
\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:44Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.009991 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea1
77225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:45Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.042168 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673
14731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b96322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:45Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.051187 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.051244 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.051256 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:45 crc 
kubenswrapper[4937]: I0225 15:47:45.051283 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.051298 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:45Z","lastTransitionTime":"2026-02-25T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.059639 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:45Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.076625 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:45Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.094207 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:45Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.108717 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:45Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.127376 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:45Z 
is after 2025-08-24T17:21:41Z" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.141859 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:45Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.153898 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.153946 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.153959 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.153976 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.153989 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:45Z","lastTransitionTime":"2026-02-25T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.157536 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:45Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.256821 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.256893 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.256912 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.256944 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.256971 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:45Z","lastTransitionTime":"2026-02-25T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.360059 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.360138 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.360149 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.360165 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.360174 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:45Z","lastTransitionTime":"2026-02-25T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.367457 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:45 crc kubenswrapper[4937]: E0225 15:47:45.367605 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.463882 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.463986 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.464001 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.464027 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.464045 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:45Z","lastTransitionTime":"2026-02-25T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.572712 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.572824 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.572838 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.572866 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.572878 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:45Z","lastTransitionTime":"2026-02-25T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.674996 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.675038 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.675049 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.675065 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.675078 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:45Z","lastTransitionTime":"2026-02-25T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.777593 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.777650 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.777662 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.777677 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.777690 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:45Z","lastTransitionTime":"2026-02-25T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.880643 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.880680 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.880690 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.880706 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.880719 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:45Z","lastTransitionTime":"2026-02-25T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.927453 4937 generic.go:334] "Generic (PLEG): container finished" podID="3aad09c8-a744-4e42-a270-8cfee256b07f" containerID="6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1" exitCode=0 Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.927556 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" event={"ID":"3aad09c8-a744-4e42-a270-8cfee256b07f","Type":"ContainerDied","Data":"6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1"} Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.932019 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerStarted","Data":"6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9"} Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.932168 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerStarted","Data":"f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2"} Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.944251 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:45Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.961571 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:45Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.978945 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:45Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.985245 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.985288 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.985301 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.985320 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.985333 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:45Z","lastTransitionTime":"2026-02-25T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:45 crc kubenswrapper[4937]: I0225 15:47:45.994509 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:45Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.009498 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:46Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.028590 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:46Z 
is after 2025-08-24T17:21:41Z" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.043218 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:46Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.053985 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:46Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.070363 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:46Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.090046 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.090091 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.090108 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.090132 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.090143 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:46Z","lastTransitionTime":"2026-02-25T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.094436 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b96322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:46Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.108664 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:46Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.124726 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:46Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.140883 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:46Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.194369 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.194423 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.194436 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.194530 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.194551 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:46Z","lastTransitionTime":"2026-02-25T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.298645 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.299121 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.299181 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.299207 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.299262 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:46Z","lastTransitionTime":"2026-02-25T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.367405 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:46 crc kubenswrapper[4937]: E0225 15:47:46.367582 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.367405 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:46 crc kubenswrapper[4937]: E0225 15:47:46.367966 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.402260 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.402310 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.402322 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.402340 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.402352 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:46Z","lastTransitionTime":"2026-02-25T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.505616 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.505691 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.505706 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.505735 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.505749 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:46Z","lastTransitionTime":"2026-02-25T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.609106 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.609202 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.609226 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.609252 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.609270 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:46Z","lastTransitionTime":"2026-02-25T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.711602 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.711642 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.711654 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.711671 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.711682 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:46Z","lastTransitionTime":"2026-02-25T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.814989 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.815595 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.815609 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.815630 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.815647 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:46Z","lastTransitionTime":"2026-02-25T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.917919 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.917961 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.917970 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.917985 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.917994 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:46Z","lastTransitionTime":"2026-02-25T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.936190 4937 generic.go:334] "Generic (PLEG): container finished" podID="3aad09c8-a744-4e42-a270-8cfee256b07f" containerID="48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4" exitCode=0 Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.936223 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" event={"ID":"3aad09c8-a744-4e42-a270-8cfee256b07f","Type":"ContainerDied","Data":"48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4"} Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.952042 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:46Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.968898 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:46Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:46 crc kubenswrapper[4937]: I0225 15:47:46.984117 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:46Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.004604 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.017759 4937 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.020635 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.020670 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.020681 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.020700 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.020714 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:47Z","lastTransitionTime":"2026-02-25T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.041781 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420
253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.060666 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.076660 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.096333 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.117109 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b96322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900922
72e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.124039 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.124077 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.124089 4937 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.124107 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.124118 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:47Z","lastTransitionTime":"2026-02-25T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.131973 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.148960 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.164333 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.226373 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.226428 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.226441 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.226459 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.226470 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:47Z","lastTransitionTime":"2026-02-25T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.329475 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.329546 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.329557 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.329580 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.329594 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:47Z","lastTransitionTime":"2026-02-25T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.367175 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:47 crc kubenswrapper[4937]: E0225 15:47:47.367353 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.458289 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.458332 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.458342 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.458364 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.458376 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:47Z","lastTransitionTime":"2026-02-25T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.565824 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.565899 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.565920 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.565946 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.565967 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:47Z","lastTransitionTime":"2026-02-25T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.626990 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-6pm6h"] Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.627619 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6pm6h" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.629686 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.630966 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.631195 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.631806 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.645364 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.648106 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fc7de10e-0fed-4e96-871d-e80dbc4134c1-serviceca\") pod \"node-ca-6pm6h\" (UID: \"fc7de10e-0fed-4e96-871d-e80dbc4134c1\") " pod="openshift-image-registry/node-ca-6pm6h" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.648185 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc7de10e-0fed-4e96-871d-e80dbc4134c1-host\") pod \"node-ca-6pm6h\" (UID: \"fc7de10e-0fed-4e96-871d-e80dbc4134c1\") " pod="openshift-image-registry/node-ca-6pm6h" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.648214 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwzjg\" (UniqueName: \"kubernetes.io/projected/fc7de10e-0fed-4e96-871d-e80dbc4134c1-kube-api-access-jwzjg\") pod \"node-ca-6pm6h\" (UID: \"fc7de10e-0fed-4e96-871d-e80dbc4134c1\") " pod="openshift-image-registry/node-ca-6pm6h" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.662510 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.668464 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.668626 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.668641 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.668658 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.668671 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:47Z","lastTransitionTime":"2026-02-25T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.683862 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420
253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.701212 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.715661 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.738113 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.749097 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwzjg\" (UniqueName: \"kubernetes.io/projected/fc7de10e-0fed-4e96-871d-e80dbc4134c1-kube-api-access-jwzjg\") pod \"node-ca-6pm6h\" (UID: \"fc7de10e-0fed-4e96-871d-e80dbc4134c1\") " pod="openshift-image-registry/node-ca-6pm6h" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.749173 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fc7de10e-0fed-4e96-871d-e80dbc4134c1-serviceca\") pod \"node-ca-6pm6h\" (UID: \"fc7de10e-0fed-4e96-871d-e80dbc4134c1\") " pod="openshift-image-registry/node-ca-6pm6h" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.749224 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc7de10e-0fed-4e96-871d-e80dbc4134c1-host\") pod \"node-ca-6pm6h\" (UID: \"fc7de10e-0fed-4e96-871d-e80dbc4134c1\") " pod="openshift-image-registry/node-ca-6pm6h" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.749295 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc7de10e-0fed-4e96-871d-e80dbc4134c1-host\") pod \"node-ca-6pm6h\" (UID: \"fc7de10e-0fed-4e96-871d-e80dbc4134c1\") " pod="openshift-image-registry/node-ca-6pm6h" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.750857 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.751295 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fc7de10e-0fed-4e96-871d-e80dbc4134c1-serviceca\") pod \"node-ca-6pm6h\" (UID: \"fc7de10e-0fed-4e96-871d-e80dbc4134c1\") " pod="openshift-image-registry/node-ca-6pm6h" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.771708 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwzjg\" (UniqueName: \"kubernetes.io/projected/fc7de10e-0fed-4e96-871d-e80dbc4134c1-kube-api-access-jwzjg\") pod \"node-ca-6pm6h\" (UID: \"fc7de10e-0fed-4e96-871d-e80dbc4134c1\") " pod="openshift-image-registry/node-ca-6pm6h" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.772183 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.772221 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.772230 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.772246 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.772257 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:47Z","lastTransitionTime":"2026-02-25T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.778431 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b96322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.797071 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.809374 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.828873 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.843824 4937 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.857699 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.870391 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.875426 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.875477 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.875535 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.875554 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.875566 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:47Z","lastTransitionTime":"2026-02-25T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.942944 4937 generic.go:334] "Generic (PLEG): container finished" podID="3aad09c8-a744-4e42-a270-8cfee256b07f" containerID="eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b" exitCode=0 Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.943028 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" event={"ID":"3aad09c8-a744-4e42-a270-8cfee256b07f","Type":"ContainerDied","Data":"eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b"} Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.947400 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerStarted","Data":"b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55"} Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.978930 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.978977 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.978991 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.979010 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.979023 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:47Z","lastTransitionTime":"2026-02-25T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.981026 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6pm6h" Feb 25 15:47:47 crc kubenswrapper[4937]: I0225 15:47:47.986261 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:47Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.014922 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.026841 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.040649 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.050736 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.061963 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.073712 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.082183 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.082214 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.082223 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.082237 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.082245 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:48Z","lastTransitionTime":"2026-02-25T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.096907 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.110621 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.122526 4937 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.144037 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mount
Path\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.156122 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.168461 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.181439 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.185464 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.185509 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.185520 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.185535 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.185545 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:48Z","lastTransitionTime":"2026-02-25T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.290074 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.290154 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.290171 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.290194 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.290214 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:48Z","lastTransitionTime":"2026-02-25T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.366540 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.366555 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:48 crc kubenswrapper[4937]: E0225 15:47:48.366705 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:48 crc kubenswrapper[4937]: E0225 15:47:48.366801 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.394367 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.394418 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.394437 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.394461 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.394479 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:48Z","lastTransitionTime":"2026-02-25T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.499900 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.499933 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.499943 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.499957 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.499967 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:48Z","lastTransitionTime":"2026-02-25T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.603219 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.603279 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.603330 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.603358 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.603379 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:48Z","lastTransitionTime":"2026-02-25T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.706194 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.706256 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.706277 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.706301 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.706320 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:48Z","lastTransitionTime":"2026-02-25T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.809443 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.809556 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.809583 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.809615 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.809637 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:48Z","lastTransitionTime":"2026-02-25T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.913022 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.913106 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.913125 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.913151 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.913169 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:48Z","lastTransitionTime":"2026-02-25T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.954030 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6pm6h" event={"ID":"fc7de10e-0fed-4e96-871d-e80dbc4134c1","Type":"ContainerStarted","Data":"dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23"} Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.954117 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6pm6h" event={"ID":"fc7de10e-0fed-4e96-871d-e80dbc4134c1","Type":"ContainerStarted","Data":"db260990f89406206562e34f39ec3fb2f0273aba463baa871a27b91406e3d819"} Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.959743 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" event={"ID":"3aad09c8-a744-4e42-a270-8cfee256b07f","Type":"ContainerStarted","Data":"a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597"} Feb 25 15:47:48 crc kubenswrapper[4937]: I0225 15:47:48.978183 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.000279 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:48Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.016284 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.016366 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.016390 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.016419 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.016439 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:49Z","lastTransitionTime":"2026-02-25T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.022225 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.040859 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.060364 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.096215 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z 
is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.119911 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.119986 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.120039 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.120073 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.120095 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:49Z","lastTransitionTime":"2026-02-25T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.122834 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.140019 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.163957 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.179175 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.213180 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.222454 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.222558 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.222575 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.222601 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.222616 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:49Z","lastTransitionTime":"2026-02-25T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.234293 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.257260 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.272305 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.289848 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.305381 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.324198 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.324247 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.324258 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.324275 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.324288 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:49Z","lastTransitionTime":"2026-02-25T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.336294 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420
253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.355239 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.366991 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:49 crc kubenswrapper[4937]: E0225 15:47:49.367175 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.368211 4937 scope.go:117] "RemoveContainer" containerID="74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.373346 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.393084 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.420842 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.426766 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.426801 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.426810 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.426828 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.426840 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:49Z","lastTransitionTime":"2026-02-25T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.432823 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.444830 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.457832 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.469261 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.481758 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.493774 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.508081 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:49Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.530682 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.530730 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.530749 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.530770 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.530786 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:49Z","lastTransitionTime":"2026-02-25T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.633367 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.633444 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.633466 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.633523 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.633542 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:49Z","lastTransitionTime":"2026-02-25T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.736977 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.737037 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.737057 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.737081 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.737099 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:49Z","lastTransitionTime":"2026-02-25T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.839520 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.839571 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.839585 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.839600 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.839614 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:49Z","lastTransitionTime":"2026-02-25T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.942941 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.943025 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.943054 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.943082 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:49 crc kubenswrapper[4937]: I0225 15:47:49.943100 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:49Z","lastTransitionTime":"2026-02-25T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.046646 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.046691 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.046710 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.046734 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.046753 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.150134 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.150185 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.150197 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.150216 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.150228 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.253682 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.253742 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.253764 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.253791 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.253813 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.357081 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.357121 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.357130 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.357146 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.357156 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.366765 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.366845 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:50 crc kubenswrapper[4937]: E0225 15:47:50.366957 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:50 crc kubenswrapper[4937]: E0225 15:47:50.368057 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.460371 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.460431 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.460448 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.460474 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.460513 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.564170 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.564211 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.564227 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.564248 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.564262 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.668092 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.668137 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.668152 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.668174 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.668187 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.770658 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.770716 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.770733 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.770761 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.770777 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.860644 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.860741 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.860757 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.860784 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.860802 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: E0225 15:47:50.886382 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:50Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.892318 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.892371 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.892388 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.892414 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.892432 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: E0225 15:47:50.921534 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:50Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.927171 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.927194 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.927202 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.927216 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.927225 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: E0225 15:47:50.942793 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:50Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.950957 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.950975 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.950984 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.950997 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.951007 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: E0225 15:47:50.968933 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:50Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.972674 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.972723 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.972737 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.972755 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.972771 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.977972 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerStarted","Data":"53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.979276 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.979324 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.979415 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.984920 4937 generic.go:334] "Generic (PLEG): container finished" podID="3aad09c8-a744-4e42-a270-8cfee256b07f" containerID="a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597" exitCode=0 Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.985005 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" event={"ID":"3aad09c8-a744-4e42-a270-8cfee256b07f","Type":"ContainerDied","Data":"a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.987194 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 15:47:50 crc kubenswrapper[4937]: E0225 15:47:50.992144 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:50Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:50 crc kubenswrapper[4937]: E0225 15:47:50.992364 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.994095 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.994126 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.994144 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.994169 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.994185 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:50Z","lastTransitionTime":"2026-02-25T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.994562 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52"} Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.995383 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:47:50 crc kubenswrapper[4937]: I0225 15:47:50.995409 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:50Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.017886 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.022443 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.025225 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.033259 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.045940 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.059154 4937 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.081146 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb2
2425c9880a34777fd20402ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.096296 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.103570 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.104207 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.104224 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.104245 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.104257 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:51Z","lastTransitionTime":"2026-02-25T15:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.117471 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.142575 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.157286 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.170130 4937 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.198102 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.207200 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.207226 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.207236 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.207254 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.207266 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:47:51Z","lastTransitionTime":"2026-02-25T15:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.211006 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.228578 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.249780 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.272948 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.289382 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: E0225 15:47:51.309392 4937 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.318601 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.338700 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},
{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.352886 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.366084 4937 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.370050 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:51 crc kubenswrapper[4937]: E0225 15:47:51.370194 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.387017 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b96322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a
0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.404381 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.419361 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.431249 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.441948 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.459238 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath
\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.472971 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.484994 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.497683 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.508666 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.528815 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.540729 4937 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.560711 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb2
2425c9880a34777fd20402ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.580958 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.592864 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.610679 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.620018 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.639281 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.651368 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.662774 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:51 crc kubenswrapper[4937]: I0225 15:47:51.672419 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:52 crc kubenswrapper[4937]: I0225 15:47:52.366842 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:52 crc kubenswrapper[4937]: E0225 15:47:52.367018 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:52 crc kubenswrapper[4937]: I0225 15:47:52.367107 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:52 crc kubenswrapper[4937]: E0225 15:47:52.367236 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:52 crc kubenswrapper[4937]: E0225 15:47:52.561554 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.006362 4937 generic.go:334] "Generic (PLEG): container finished" podID="3aad09c8-a744-4e42-a270-8cfee256b07f" containerID="9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628" exitCode=0 Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.006419 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" event={"ID":"3aad09c8-a744-4e42-a270-8cfee256b07f","Type":"ContainerDied","Data":"9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628"} Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.018358 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.030020 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.052775 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.078220 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb2
2425c9880a34777fd20402ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.094979 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.112030 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.128558 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.142685 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.167326 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24
edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.187244 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.200374 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.212751 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.241574 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.259108 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.367306 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:53 crc kubenswrapper[4937]: E0225 15:47:53.367619 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.657603 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg"] Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.658774 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.662380 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.662473 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.679679 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.698468 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.717124 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjdn2\" (UniqueName: \"kubernetes.io/projected/5c923347-0d7f-4647-a3d3-1a0e5e68daaf-kube-api-access-zjdn2\") pod \"ovnkube-control-plane-749d76644c-2mjkg\" (UID: \"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.717189 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c923347-0d7f-4647-a3d3-1a0e5e68daaf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2mjkg\" (UID: \"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.717204 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c923347-0d7f-4647-a3d3-1a0e5e68daaf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2mjkg\" (UID: \"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.717220 4937 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c923347-0d7f-4647-a3d3-1a0e5e68daaf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2mjkg\" (UID: \"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.732038 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/
etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"r
un-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.753894 4937 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.769382 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.791864 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24
edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.807509 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.818574 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c923347-0d7f-4647-a3d3-1a0e5e68daaf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2mjkg\" (UID: \"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.818646 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c923347-0d7f-4647-a3d3-1a0e5e68daaf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2mjkg\" (UID: \"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.818701 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c923347-0d7f-4647-a3d3-1a0e5e68daaf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2mjkg\" (UID: \"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.818830 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjdn2\" (UniqueName: \"kubernetes.io/projected/5c923347-0d7f-4647-a3d3-1a0e5e68daaf-kube-api-access-zjdn2\") pod \"ovnkube-control-plane-749d76644c-2mjkg\" (UID: \"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.820278 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c923347-0d7f-4647-a3d3-1a0e5e68daaf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2mjkg\" (UID: \"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.821459 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c923347-0d7f-4647-a3d3-1a0e5e68daaf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2mjkg\" (UID: \"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.821932 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.827396 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c923347-0d7f-4647-a3d3-1a0e5e68daaf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2mjkg\" (UID: \"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.845942 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjdn2\" (UniqueName: \"kubernetes.io/projected/5c923347-0d7f-4647-a3d3-1a0e5e68daaf-kube-api-access-zjdn2\") pod \"ovnkube-control-plane-749d76644c-2mjkg\" (UID: \"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.892328 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.909853 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.940151 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.972901 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" Feb 25 15:47:53 crc kubenswrapper[4937]: I0225 15:47:53.975418 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:53 crc kubenswrapper[4937]: W0225 15:47:53.986674 4937 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c923347_0d7f_4647_a3d3_1a0e5e68daaf.slice/crio-8a00b17db8b9bb0d75c3fb6446df00f17c8b5687fc454f8258eb61d9a820ea0b WatchSource:0}: Error finding container 8a00b17db8b9bb0d75c3fb6446df00f17c8b5687fc454f8258eb61d9a820ea0b: Status 404 returned error can't find the container with id 8a00b17db8b9bb0d75c3fb6446df00f17c8b5687fc454f8258eb61d9a820ea0b Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.001303 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.009746 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" event={"ID":"5c923347-0d7f-4647-a3d3-1a0e5e68daaf","Type":"ContainerStarted","Data":"8a00b17db8b9bb0d75c3fb6446df00f17c8b5687fc454f8258eb61d9a820ea0b"} Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.012960 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" event={"ID":"3aad09c8-a744-4e42-a270-8cfee256b07f","Type":"ContainerStarted","Data":"207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b"} Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.014887 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.029781 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.042319 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.054792 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.070421 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.082774 4937 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.097994 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.124039 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.143555 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.159291 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.172027 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.181896 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.205889 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.219070 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.232142 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.250185 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.263993 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.367052 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.367106 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:54 crc kubenswrapper[4937]: E0225 15:47:54.367244 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:54 crc kubenswrapper[4937]: E0225 15:47:54.367403 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.383176 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.423192 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-sz7zh"] Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.425031 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:47:54 crc kubenswrapper[4937]: E0225 15:47:54.425329 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.446258 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.479256 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.493689 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.506064 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.520743 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.526368 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs742\" (UniqueName: \"kubernetes.io/projected/f125006f-2b26-4ffe-ac0d-dc756f48b067-kube-api-access-bs742\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.526755 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.542658 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.554282 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.569824 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.595043 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.613394 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.627960 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.628089 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs742\" (UniqueName: \"kubernetes.io/projected/f125006f-2b26-4ffe-ac0d-dc756f48b067-kube-api-access-bs742\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:47:54 crc kubenswrapper[4937]: E0225 15:47:54.628217 4937 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:47:54 crc kubenswrapper[4937]: E0225 15:47:54.628311 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs podName:f125006f-2b26-4ffe-ac0d-dc756f48b067 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:55.128280809 +0000 UTC m=+126.141672709 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs") pod "network-metrics-daemon-sz7zh" (UID: "f125006f-2b26-4ffe-ac0d-dc756f48b067") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.634011 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.650537 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.660660 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs742\" (UniqueName: \"kubernetes.io/projected/f125006f-2b26-4ffe-ac0d-dc756f48b067-kube-api-access-bs742\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 
15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.670788 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.705416 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb2
2425c9880a34777fd20402ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.723446 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.737808 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:54 crc kubenswrapper[4937]: I0225 15:47:54.754936 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:54Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:55 crc kubenswrapper[4937]: I0225 15:47:55.133563 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " 
pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:47:55 crc kubenswrapper[4937]: E0225 15:47:55.133843 4937 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:47:55 crc kubenswrapper[4937]: E0225 15:47:55.133964 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs podName:f125006f-2b26-4ffe-ac0d-dc756f48b067 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:56.133927813 +0000 UTC m=+127.147319743 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs") pod "network-metrics-daemon-sz7zh" (UID: "f125006f-2b26-4ffe-ac0d-dc756f48b067") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:47:55 crc kubenswrapper[4937]: I0225 15:47:55.367696 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:55 crc kubenswrapper[4937]: E0225 15:47:55.367899 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:56 crc kubenswrapper[4937]: I0225 15:47:56.022751 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" event={"ID":"5c923347-0d7f-4647-a3d3-1a0e5e68daaf","Type":"ContainerStarted","Data":"87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969"} Feb 25 15:47:56 crc kubenswrapper[4937]: I0225 15:47:56.142877 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:47:56 crc kubenswrapper[4937]: E0225 15:47:56.143060 4937 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:47:56 crc kubenswrapper[4937]: E0225 15:47:56.143112 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs podName:f125006f-2b26-4ffe-ac0d-dc756f48b067 nodeName:}" failed. No retries permitted until 2026-02-25 15:47:58.143097628 +0000 UTC m=+129.156489528 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs") pod "network-metrics-daemon-sz7zh" (UID: "f125006f-2b26-4ffe-ac0d-dc756f48b067") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:47:56 crc kubenswrapper[4937]: I0225 15:47:56.366588 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:56 crc kubenswrapper[4937]: E0225 15:47:56.366973 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:56 crc kubenswrapper[4937]: I0225 15:47:56.366685 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:47:56 crc kubenswrapper[4937]: E0225 15:47:56.367370 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:47:56 crc kubenswrapper[4937]: I0225 15:47:56.366621 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:56 crc kubenswrapper[4937]: E0225 15:47:56.367583 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:57 crc kubenswrapper[4937]: I0225 15:47:57.367199 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:57 crc kubenswrapper[4937]: E0225 15:47:57.367346 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:57 crc kubenswrapper[4937]: E0225 15:47:57.562298 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.031003 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" event={"ID":"5c923347-0d7f-4647-a3d3-1a0e5e68daaf","Type":"ContainerStarted","Data":"de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7"} Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.033436 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/0.log" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.036929 4937 generic.go:334] "Generic (PLEG): container finished" podID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerID="53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff" exitCode=1 Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.036990 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff"} Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.037830 4937 scope.go:117] "RemoveContainer" containerID="53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.051986 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 
15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.073825 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.092203 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.125696 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb2
2425c9880a34777fd20402ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.151780 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.165141 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.165323 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:47:58 crc kubenswrapper[4937]: E0225 15:47:58.165457 4937 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:47:58 crc kubenswrapper[4937]: E0225 15:47:58.165584 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs podName:f125006f-2b26-4ffe-ac0d-dc756f48b067 nodeName:}" failed. No retries permitted until 2026-02-25 15:48:02.165562832 +0000 UTC m=+133.178954722 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs") pod "network-metrics-daemon-sz7zh" (UID: "f125006f-2b26-4ffe-ac0d-dc756f48b067") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.179182 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.210178 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b96322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630
ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.224926 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.240303 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.255491 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.265447 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.277657 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env
\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.287033 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.300044 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.313135 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.327465 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.341226 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.355057 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.366467 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.366674 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.366723 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.366801 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:47:58 crc kubenswrapper[4937]: E0225 15:47:58.366900 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:47:58 crc kubenswrapper[4937]: E0225 15:47:58.366836 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:47:58 crc kubenswrapper[4937]: E0225 15:47:58.367082 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.379792 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" 
]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.392436 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.405498 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.433791 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:47:57Z\\\",\\\"message\\\":\\\"client-go/informers/factory.go:160\\\\nI0225 15:47:57.362276 6741 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0225 15:47:57.362628 6741 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0225 15:47:57.362647 6741 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0225 15:47:57.362673 6741 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0225 15:47:57.362678 6741 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0225 15:47:57.362697 6741 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0225 15:47:57.362732 6741 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0225 15:47:57.362757 6741 factory.go:656] Stopping watch factory\\\\nI0225 15:47:57.362769 6741 ovnkube.go:599] Stopped ovnkube\\\\nI0225 15:47:57.362792 6741 handler.go:208] Removed *v1.Node event handler 2\\\\nI0225 15:47:57.362800 6741 handler.go:208] Removed *v1.Node event handler 7\\\\nI0225 15:47:57.362806 6741 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0225 15:47:57.362813 6741 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0225 15:47:57.362821 6741 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0225 
15:47:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.447084 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.459887 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.476151 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.489125 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.508215 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.521184 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.534875 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.549264 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.564723 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:58 crc kubenswrapper[4937]: I0225 15:47:58.581171 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env
\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:58Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.042228 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/0.log" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.045281 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerStarted","Data":"ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314"} Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.061204 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.072664 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.093597 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:47:57Z\\\",\\\"message\\\":\\\"client-go/informers/factory.go:160\\\\nI0225 15:47:57.362276 6741 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0225 15:47:57.362628 6741 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0225 15:47:57.362647 6741 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0225 15:47:57.362673 6741 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0225 15:47:57.362678 6741 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0225 15:47:57.362697 6741 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0225 15:47:57.362732 6741 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0225 15:47:57.362757 6741 factory.go:656] Stopping watch factory\\\\nI0225 15:47:57.362769 6741 ovnkube.go:599] Stopped ovnkube\\\\nI0225 15:47:57.362792 6741 handler.go:208] Removed *v1.Node event handler 2\\\\nI0225 15:47:57.362800 6741 handler.go:208] Removed *v1.Node event handler 7\\\\nI0225 15:47:57.362806 6741 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0225 15:47:57.362813 6741 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0225 15:47:57.362821 6741 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0225 
15:47:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.106683 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b
82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.118322 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"
startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.132904 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\
"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.150859 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.180449 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.197631 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.216513 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.232357 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.243322 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.255533 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env
\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.267319 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.283457 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.299524 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.317618 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:47:59Z is after 2025-08-24T17:21:41Z" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.367797 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:47:59 crc kubenswrapper[4937]: E0225 15:47:59.367992 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.883227 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.883443 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:59 crc kubenswrapper[4937]: E0225 15:47:59.883579 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:48:31.883539347 +0000 UTC m=+162.896931277 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:47:59 crc kubenswrapper[4937]: E0225 15:47:59.883673 4937 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:59 crc kubenswrapper[4937]: I0225 15:47:59.883750 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:47:59 crc kubenswrapper[4937]: E0225 15:47:59.883789 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:48:31.883760102 +0000 UTC m=+162.897152032 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:47:59 crc kubenswrapper[4937]: E0225 15:47:59.883980 4937 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:47:59 crc kubenswrapper[4937]: E0225 15:47:59.884050 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:48:31.884036009 +0000 UTC m=+162.897427929 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:47:59.984741 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:47:59.984859 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:00 crc kubenswrapper[4937]: E0225 15:47:59.984941 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:48:00 crc kubenswrapper[4937]: E0225 15:47:59.984968 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:48:00 crc kubenswrapper[4937]: E0225 15:47:59.984982 4937 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:48:00 crc kubenswrapper[4937]: E0225 15:47:59.985010 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:48:00 crc kubenswrapper[4937]: E0225 15:47:59.985034 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:48:00 crc kubenswrapper[4937]: E0225 15:47:59.985046 
4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 15:48:31.985028121 +0000 UTC m=+162.998420091 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:48:00 crc kubenswrapper[4937]: E0225 15:47:59.985050 4937 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:48:00 crc kubenswrapper[4937]: E0225 15:47:59.985104 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 15:48:31.985088283 +0000 UTC m=+162.998480203 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.050178 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/1.log" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.050820 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/0.log" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.054032 4937 generic.go:334] "Generic (PLEG): container finished" podID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerID="ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314" exitCode=1 Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.054075 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314"} Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.054111 4937 scope.go:117] "RemoveContainer" containerID="53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.054818 4937 scope.go:117] "RemoveContainer" containerID="ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314" Feb 25 15:48:00 crc kubenswrapper[4937]: E0225 15:48:00.054993 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=ovnkube-controller pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.074860 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.094204 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.130979 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b
911bf5775b05ae44a1a92314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:47:57Z\\\",\\\"message\\\":\\\"client-go/informers/factory.go:160\\\\nI0225 15:47:57.362276 6741 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0225 15:47:57.362628 6741 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0225 15:47:57.362647 6741 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0225 15:47:57.362673 6741 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0225 15:47:57.362678 6741 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0225 15:47:57.362697 6741 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0225 15:47:57.362732 6741 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0225 15:47:57.362757 6741 factory.go:656] Stopping watch factory\\\\nI0225 15:47:57.362769 6741 ovnkube.go:599] Stopped ovnkube\\\\nI0225 15:47:57.362792 6741 handler.go:208] Removed *v1.Node event handler 2\\\\nI0225 15:47:57.362800 6741 handler.go:208] Removed *v1.Node event handler 7\\\\nI0225 15:47:57.362806 6741 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0225 15:47:57.362813 6741 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0225 15:47:57.362821 6741 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0225 15:47:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:47:59Z\\\",\\\"message\\\":\\\"hift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"3ec9f67e-7758-4707-a6d0-2dc28f28ac37\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0225 15:47:59.213573 7013 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.151862 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.166273 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.189320 4937 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c
857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-
release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.218424 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.242208 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.257407 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.278328 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.296360 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.310611 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.330354 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env
\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.347728 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.364972 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.366827 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.366892 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.366840 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:00 crc kubenswrapper[4937]: E0225 15:48:00.367018 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:00 crc kubenswrapper[4937]: E0225 15:48:00.367199 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:00 crc kubenswrapper[4937]: E0225 15:48:00.367446 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.383975 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.400435 4937 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.621867 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.639348 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.657136 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.669810 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.688686 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.704805 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.721963 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.752071 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b
911bf5775b05ae44a1a92314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:47:57Z\\\",\\\"message\\\":\\\"client-go/informers/factory.go:160\\\\nI0225 15:47:57.362276 6741 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0225 15:47:57.362628 6741 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0225 15:47:57.362647 6741 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0225 15:47:57.362673 6741 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0225 15:47:57.362678 6741 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0225 15:47:57.362697 6741 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0225 15:47:57.362732 6741 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0225 15:47:57.362757 6741 factory.go:656] Stopping watch factory\\\\nI0225 15:47:57.362769 6741 ovnkube.go:599] Stopped ovnkube\\\\nI0225 15:47:57.362792 6741 handler.go:208] Removed *v1.Node event handler 2\\\\nI0225 15:47:57.362800 6741 handler.go:208] Removed *v1.Node event handler 7\\\\nI0225 15:47:57.362806 6741 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0225 15:47:57.362813 6741 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0225 15:47:57.362821 6741 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0225 15:47:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:47:59Z\\\",\\\"message\\\":\\\"hift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"3ec9f67e-7758-4707-a6d0-2dc28f28ac37\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0225 15:47:59.213573 7013 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.774139 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.789047 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.810858 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.826113 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.843070 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 
15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.858227 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.902342 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.917004 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.931110 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:00 crc kubenswrapper[4937]: I0225 15:48:00.949180 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:00Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.061100 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/1.log" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.364965 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.365170 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.365196 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.365222 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.365239 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:01Z","lastTransitionTime":"2026-02-25T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.367769 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:01 crc kubenswrapper[4937]: E0225 15:48:01.368753 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:01 crc kubenswrapper[4937]: E0225 15:48:01.382250 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.386473 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.386601 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.386622 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.386645 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.386668 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:01Z","lastTransitionTime":"2026-02-25T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.395139 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubel
et\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: E0225 15:48:01.409195 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.413391 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.413722 4937 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.413774 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.413790 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.414006 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.414023 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:01Z","lastTransitionTime":"2026-02-25T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.429909 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: E0225 15:48:01.433828 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.438476 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.438702 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.438733 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.438764 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.438787 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:01Z","lastTransitionTime":"2026-02-25T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.445795 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: E0225 15:48:01.461074 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.461678 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.465467 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.465527 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.465623 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.465650 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.465667 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:01Z","lastTransitionTime":"2026-02-25T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:01 crc kubenswrapper[4937]: E0225 15:48:01.481736 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: E0225 15:48:01.481844 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.489161 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6
c45adc6397b964b96322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239
ae2d4d6c1cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.500608 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.516779 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.530173 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.545143 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.562712 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b
911bf5775b05ae44a1a92314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:47:57Z\\\",\\\"message\\\":\\\"client-go/informers/factory.go:160\\\\nI0225 15:47:57.362276 6741 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0225 15:47:57.362628 6741 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0225 15:47:57.362647 6741 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0225 15:47:57.362673 6741 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0225 15:47:57.362678 6741 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0225 15:47:57.362697 6741 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0225 15:47:57.362732 6741 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0225 15:47:57.362757 6741 factory.go:656] Stopping watch factory\\\\nI0225 15:47:57.362769 6741 ovnkube.go:599] Stopped ovnkube\\\\nI0225 15:47:57.362792 6741 handler.go:208] Removed *v1.Node event handler 2\\\\nI0225 15:47:57.362800 6741 handler.go:208] Removed *v1.Node event handler 7\\\\nI0225 15:47:57.362806 6741 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0225 15:47:57.362813 6741 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0225 15:47:57.362821 6741 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0225 15:47:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:47:59Z\\\",\\\"message\\\":\\\"hift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"3ec9f67e-7758-4707-a6d0-2dc28f28ac37\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0225 15:47:59.213573 7013 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.573416 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.586304 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.597987 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.612048 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.623042 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:01 crc kubenswrapper[4937]: I0225 15:48:01.637074 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:01Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:02 crc kubenswrapper[4937]: I0225 15:48:02.207309 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " 
pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:02 crc kubenswrapper[4937]: E0225 15:48:02.207555 4937 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:48:02 crc kubenswrapper[4937]: E0225 15:48:02.207624 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs podName:f125006f-2b26-4ffe-ac0d-dc756f48b067 nodeName:}" failed. No retries permitted until 2026-02-25 15:48:10.207603434 +0000 UTC m=+141.220995354 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs") pod "network-metrics-daemon-sz7zh" (UID: "f125006f-2b26-4ffe-ac0d-dc756f48b067") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:48:02 crc kubenswrapper[4937]: I0225 15:48:02.366728 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:02 crc kubenswrapper[4937]: I0225 15:48:02.366767 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:02 crc kubenswrapper[4937]: E0225 15:48:02.366843 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:02 crc kubenswrapper[4937]: I0225 15:48:02.366853 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:02 crc kubenswrapper[4937]: E0225 15:48:02.366937 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:02 crc kubenswrapper[4937]: E0225 15:48:02.367032 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:02 crc kubenswrapper[4937]: E0225 15:48:02.603207 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:48:03 crc kubenswrapper[4937]: I0225 15:48:03.367651 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:03 crc kubenswrapper[4937]: E0225 15:48:03.368055 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:03 crc kubenswrapper[4937]: I0225 15:48:03.381144 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 25 15:48:04 crc kubenswrapper[4937]: I0225 15:48:04.366892 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:04 crc kubenswrapper[4937]: I0225 15:48:04.367008 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:04 crc kubenswrapper[4937]: I0225 15:48:04.367067 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:04 crc kubenswrapper[4937]: E0225 15:48:04.367065 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:04 crc kubenswrapper[4937]: E0225 15:48:04.367183 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:04 crc kubenswrapper[4937]: E0225 15:48:04.367293 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:05 crc kubenswrapper[4937]: I0225 15:48:05.367423 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:05 crc kubenswrapper[4937]: E0225 15:48:05.367601 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:06 crc kubenswrapper[4937]: I0225 15:48:06.367253 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:06 crc kubenswrapper[4937]: I0225 15:48:06.367386 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:06 crc kubenswrapper[4937]: I0225 15:48:06.367444 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:06 crc kubenswrapper[4937]: E0225 15:48:06.367529 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:06 crc kubenswrapper[4937]: E0225 15:48:06.367624 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:06 crc kubenswrapper[4937]: E0225 15:48:06.367717 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:07 crc kubenswrapper[4937]: I0225 15:48:07.366925 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:07 crc kubenswrapper[4937]: E0225 15:48:07.367181 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:07 crc kubenswrapper[4937]: E0225 15:48:07.604137 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:48:08 crc kubenswrapper[4937]: I0225 15:48:08.366843 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:08 crc kubenswrapper[4937]: I0225 15:48:08.366911 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:08 crc kubenswrapper[4937]: I0225 15:48:08.367026 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:08 crc kubenswrapper[4937]: E0225 15:48:08.367257 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:08 crc kubenswrapper[4937]: E0225 15:48:08.367450 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:08 crc kubenswrapper[4937]: E0225 15:48:08.367577 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:09 crc kubenswrapper[4937]: I0225 15:48:09.367033 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:09 crc kubenswrapper[4937]: E0225 15:48:09.367265 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:10 crc kubenswrapper[4937]: I0225 15:48:10.304412 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:10 crc kubenswrapper[4937]: E0225 15:48:10.304601 4937 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:48:10 crc kubenswrapper[4937]: E0225 15:48:10.304664 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs podName:f125006f-2b26-4ffe-ac0d-dc756f48b067 nodeName:}" failed. No retries permitted until 2026-02-25 15:48:26.304647929 +0000 UTC m=+157.318039819 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs") pod "network-metrics-daemon-sz7zh" (UID: "f125006f-2b26-4ffe-ac0d-dc756f48b067") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:48:10 crc kubenswrapper[4937]: I0225 15:48:10.367271 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:10 crc kubenswrapper[4937]: E0225 15:48:10.367387 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:10 crc kubenswrapper[4937]: I0225 15:48:10.367560 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:10 crc kubenswrapper[4937]: E0225 15:48:10.367617 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:10 crc kubenswrapper[4937]: I0225 15:48:10.367737 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:10 crc kubenswrapper[4937]: E0225 15:48:10.367901 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.366868 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:11 crc kubenswrapper[4937]: E0225 15:48:11.367021 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.368109 4937 scope.go:117] "RemoveContainer" containerID="ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.381712 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.396626 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.415222 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.431139 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.448069 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 
15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.465706 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.486003 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.500209 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.515163 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.527904 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45
:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.539458 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.539770 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.539957 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.540107 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.540241 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:11Z","lastTransitionTime":"2026-02-25T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.542858 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.561645 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: E0225 15:48:11.573555 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.577612 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.577748 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.577809 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.577875 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.577945 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:11Z","lastTransitionTime":"2026-02-25T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.598649 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b
911bf5775b05ae44a1a92314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e1110e953c625cbf6afc769043014872383fb22425c9880a34777fd20402ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:47:57Z\\\",\\\"message\\\":\\\"client-go/informers/factory.go:160\\\\nI0225 15:47:57.362276 6741 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0225 15:47:57.362628 6741 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0225 15:47:57.362647 6741 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0225 15:47:57.362673 6741 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0225 15:47:57.362678 6741 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0225 15:47:57.362697 6741 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0225 15:47:57.362732 6741 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0225 15:47:57.362757 6741 factory.go:656] Stopping watch factory\\\\nI0225 15:47:57.362769 6741 ovnkube.go:599] Stopped ovnkube\\\\nI0225 15:47:57.362792 6741 handler.go:208] Removed *v1.Node event handler 2\\\\nI0225 15:47:57.362800 6741 handler.go:208] Removed *v1.Node event handler 7\\\\nI0225 15:47:57.362806 6741 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0225 15:47:57.362813 6741 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0225 15:47:57.362821 6741 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0225 15:47:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:47:59Z\\\",\\\"message\\\":\\\"hift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"3ec9f67e-7758-4707-a6d0-2dc28f28ac37\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0225 15:47:59.213573 7013 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: E0225 15:48:11.616220 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8805
1c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.620854 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:11 crc 
kubenswrapper[4937]: I0225 15:48:11.620882 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.620890 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.620904 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.620915 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:11Z","lastTransitionTime":"2026-02-25T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.623375 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: E0225 15:48:11.642117 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.642331 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.647167 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.647223 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.647239 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.647259 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.647272 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:11Z","lastTransitionTime":"2026-02-25T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.659766 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: E0225 15:48:11.660284 4937 kubelet_node_status.go:585] "Error updating node status, will retry" 
err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329b
a568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.663651 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.663679 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.663689 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.663707 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.663718 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:11Z","lastTransitionTime":"2026-02-25T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:48:11 crc kubenswrapper[4937]: E0225 15:48:11.677272 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: E0225 15:48:11.677450 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.678755 4937 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\
\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.690771 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.702590 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.714290 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.727647 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.744152 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b
911bf5775b05ae44a1a92314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:47:59Z\\\",\\\"message\\\":\\\"hift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"3ec9f67e-7758-4707-a6d0-2dc28f28ac37\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0225 15:47:59.213573 7013 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.757537 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.767476 4937 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.782780 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.807610 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.824380 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.833507 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.844454 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.860911 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.878183 4937 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.893554 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 
15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.908316 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.925267 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.937287 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.954845 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:11 crc kubenswrapper[4937]: I0225 15:48:11.975102 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:11Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.104744 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/1.log" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.106934 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerStarted","Data":"b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54"} Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.107326 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.120999 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.133205 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 
15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.142745 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.169715 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.186237 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.205128 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.220446 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.238705 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.258888 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.282794 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.298334 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.310007 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.324785 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.340831 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.361791 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd5
9e5002e6d73c87e0e4b40d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:47:59Z\\\",\\\"message\\\":\\\"hift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"3ec9f67e-7758-4707-a6d0-2dc28f28ac37\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0225 15:47:59.213573 7013 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:48:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.366952 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.367025 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.367065 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:12 crc kubenswrapper[4937]: E0225 15:48:12.367204 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:12 crc kubenswrapper[4937]: E0225 15:48:12.367285 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:12 crc kubenswrapper[4937]: E0225 15:48:12.367405 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.378709 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.393737 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: I0225 15:48:12.413680 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:12Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:12 crc kubenswrapper[4937]: E0225 15:48:12.605506 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.113315 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/2.log" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.115072 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/1.log" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.118660 4937 generic.go:334] "Generic (PLEG): container finished" podID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerID="b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54" exitCode=1 Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.118725 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54"} Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.118793 4937 scope.go:117] "RemoveContainer" containerID="ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.120301 4937 scope.go:117] "RemoveContainer" containerID="b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54" Feb 25 15:48:13 crc kubenswrapper[4937]: E0225 15:48:13.120728 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.135569 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.150172 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.161189 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.174124 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 
15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.184965 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.204281 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.216699 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.233736 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.247749 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45
:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.262982 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.275992 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.297985 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd5
9e5002e6d73c87e0e4b40d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad769aa91dcdc1cc2b0fa8ab3276c891877c0e2b911bf5775b05ae44a1a92314\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:47:59Z\\\",\\\"message\\\":\\\"hift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"3ec9f67e-7758-4707-a6d0-2dc28f28ac37\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0225 15:47:59.213573 7013 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:12Z\\\",\\\"message\\\":\\\"rk-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592206 7189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592263 7189 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592431 7189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592584 7189 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592655 7189 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592623 7189 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592770 7189 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592952 7189 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.312768 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.325942 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.337946 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.353911 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.364674 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.367018 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:13 crc kubenswrapper[4937]: E0225 15:48:13.367186 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:13 crc kubenswrapper[4937]: I0225 15:48:13.387741 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:13Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.125003 
4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/2.log" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.131010 4937 scope.go:117] "RemoveContainer" containerID="b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54" Feb 25 15:48:14 crc kubenswrapper[4937]: E0225 15:48:14.131337 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.147407 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce
6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.167542 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.185347 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.205400 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.227019 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.253066 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.269845 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.303737 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd5
9e5002e6d73c87e0e4b40d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:12Z\\\",\\\"message\\\":\\\"rk-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592206 7189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592263 7189 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592431 7189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592584 7189 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592655 7189 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592623 7189 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592770 7189 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592952 7189 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.325349 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.340461 4937 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.364279 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.366987 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.367065 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.366998 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:14 crc kubenswrapper[4937]: E0225 15:48:14.367141 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:14 crc kubenswrapper[4937]: E0225 15:48:14.367249 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:14 crc kubenswrapper[4937]: E0225 15:48:14.367390 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.380673 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.416002 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.435796 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.452244 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.468972 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.486265 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:14 crc kubenswrapper[4937]: I0225 15:48:14.503428 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env
\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:14Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:15 crc kubenswrapper[4937]: I0225 15:48:15.367300 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:15 crc kubenswrapper[4937]: E0225 15:48:15.367479 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:16 crc kubenswrapper[4937]: I0225 15:48:16.367052 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:16 crc kubenswrapper[4937]: I0225 15:48:16.367175 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:16 crc kubenswrapper[4937]: I0225 15:48:16.367259 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:16 crc kubenswrapper[4937]: E0225 15:48:16.367192 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:16 crc kubenswrapper[4937]: E0225 15:48:16.367357 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:16 crc kubenswrapper[4937]: E0225 15:48:16.367539 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:17 crc kubenswrapper[4937]: I0225 15:48:17.366951 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:17 crc kubenswrapper[4937]: E0225 15:48:17.367112 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:17 crc kubenswrapper[4937]: E0225 15:48:17.607037 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:48:18 crc kubenswrapper[4937]: I0225 15:48:18.366986 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:18 crc kubenswrapper[4937]: I0225 15:48:18.367045 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:18 crc kubenswrapper[4937]: I0225 15:48:18.367015 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:18 crc kubenswrapper[4937]: E0225 15:48:18.367166 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:18 crc kubenswrapper[4937]: E0225 15:48:18.367318 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:18 crc kubenswrapper[4937]: E0225 15:48:18.367557 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:19 crc kubenswrapper[4937]: I0225 15:48:19.367625 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:19 crc kubenswrapper[4937]: E0225 15:48:19.368828 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:20 crc kubenswrapper[4937]: I0225 15:48:20.367464 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:20 crc kubenswrapper[4937]: I0225 15:48:20.367525 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:20 crc kubenswrapper[4937]: E0225 15:48:20.367670 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:20 crc kubenswrapper[4937]: I0225 15:48:20.367739 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:20 crc kubenswrapper[4937]: E0225 15:48:20.367956 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:20 crc kubenswrapper[4937]: E0225 15:48:20.368017 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.367844 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:21 crc kubenswrapper[4937]: E0225 15:48:21.368043 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.405105 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b96322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a
0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.421803 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.441690 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.458945 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.473232 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.488765 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env
\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.505863 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.524745 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.539348 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.551653 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.564008 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.576767 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.591009 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.605861 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.635900 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd5
9e5002e6d73c87e0e4b40d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:12Z\\\",\\\"message\\\":\\\"rk-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592206 7189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592263 7189 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592431 7189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592584 7189 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592655 7189 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592623 7189 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592770 7189 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592952 7189 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.653295 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.668215 4937 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.687378 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.749313 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.749365 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:21 crc 
kubenswrapper[4937]: I0225 15:48:21.749385 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.749407 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.749419 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:21Z","lastTransitionTime":"2026-02-25T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:21 crc kubenswrapper[4937]: E0225 15:48:21.766959 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.771366 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.771418 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.771429 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.771446 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.771456 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:21Z","lastTransitionTime":"2026-02-25T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:21 crc kubenswrapper[4937]: E0225 15:48:21.789810 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.794905 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.794977 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.795002 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.795034 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.795057 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:21Z","lastTransitionTime":"2026-02-25T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:21 crc kubenswrapper[4937]: E0225 15:48:21.809803 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.814788 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.814843 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.814856 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.814878 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.814890 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:21Z","lastTransitionTime":"2026-02-25T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:21 crc kubenswrapper[4937]: E0225 15:48:21.828724 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.833167 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.833220 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.833235 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.833258 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:21 crc kubenswrapper[4937]: I0225 15:48:21.833274 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:21Z","lastTransitionTime":"2026-02-25T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:21 crc kubenswrapper[4937]: E0225 15:48:21.852001 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:21Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:21 crc kubenswrapper[4937]: E0225 15:48:21.852266 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:48:22 crc kubenswrapper[4937]: I0225 15:48:22.367424 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:22 crc kubenswrapper[4937]: I0225 15:48:22.367462 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:22 crc kubenswrapper[4937]: I0225 15:48:22.367511 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:22 crc kubenswrapper[4937]: E0225 15:48:22.367599 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:22 crc kubenswrapper[4937]: E0225 15:48:22.367754 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:22 crc kubenswrapper[4937]: E0225 15:48:22.367839 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:22 crc kubenswrapper[4937]: E0225 15:48:22.609225 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:48:23 crc kubenswrapper[4937]: I0225 15:48:23.367449 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:23 crc kubenswrapper[4937]: E0225 15:48:23.368012 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:24 crc kubenswrapper[4937]: I0225 15:48:24.367156 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:24 crc kubenswrapper[4937]: I0225 15:48:24.367291 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:24 crc kubenswrapper[4937]: E0225 15:48:24.367427 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:24 crc kubenswrapper[4937]: E0225 15:48:24.367670 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:24 crc kubenswrapper[4937]: I0225 15:48:24.368553 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:24 crc kubenswrapper[4937]: E0225 15:48:24.368851 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:24 crc kubenswrapper[4937]: I0225 15:48:24.369075 4937 scope.go:117] "RemoveContainer" containerID="b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54" Feb 25 15:48:24 crc kubenswrapper[4937]: E0225 15:48:24.369302 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" Feb 25 15:48:25 crc kubenswrapper[4937]: I0225 15:48:25.367163 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:25 crc kubenswrapper[4937]: E0225 15:48:25.367303 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:26 crc kubenswrapper[4937]: I0225 15:48:26.367158 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:26 crc kubenswrapper[4937]: E0225 15:48:26.367389 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:26 crc kubenswrapper[4937]: I0225 15:48:26.367709 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:26 crc kubenswrapper[4937]: E0225 15:48:26.367824 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:26 crc kubenswrapper[4937]: I0225 15:48:26.367207 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:26 crc kubenswrapper[4937]: E0225 15:48:26.368973 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:26 crc kubenswrapper[4937]: I0225 15:48:26.403848 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:26 crc kubenswrapper[4937]: E0225 15:48:26.404328 4937 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:48:26 crc kubenswrapper[4937]: E0225 15:48:26.404546 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs podName:f125006f-2b26-4ffe-ac0d-dc756f48b067 nodeName:}" failed. No retries permitted until 2026-02-25 15:48:58.404522411 +0000 UTC m=+189.417914311 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs") pod "network-metrics-daemon-sz7zh" (UID: "f125006f-2b26-4ffe-ac0d-dc756f48b067") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:48:27 crc kubenswrapper[4937]: I0225 15:48:27.366706 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:27 crc kubenswrapper[4937]: E0225 15:48:27.366901 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:27 crc kubenswrapper[4937]: E0225 15:48:27.611543 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:48:28 crc kubenswrapper[4937]: I0225 15:48:28.366968 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:28 crc kubenswrapper[4937]: I0225 15:48:28.366980 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:28 crc kubenswrapper[4937]: I0225 15:48:28.367058 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:28 crc kubenswrapper[4937]: E0225 15:48:28.367649 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:28 crc kubenswrapper[4937]: E0225 15:48:28.367720 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:28 crc kubenswrapper[4937]: E0225 15:48:28.367472 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:29 crc kubenswrapper[4937]: I0225 15:48:29.367595 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:29 crc kubenswrapper[4937]: E0225 15:48:29.367774 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:30 crc kubenswrapper[4937]: I0225 15:48:30.367649 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:30 crc kubenswrapper[4937]: I0225 15:48:30.367650 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:30 crc kubenswrapper[4937]: I0225 15:48:30.367663 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:30 crc kubenswrapper[4937]: E0225 15:48:30.368058 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:30 crc kubenswrapper[4937]: E0225 15:48:30.367856 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:30 crc kubenswrapper[4937]: E0225 15:48:30.368218 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.366936 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:31 crc kubenswrapper[4937]: E0225 15:48:31.367244 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.387689 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.407148 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.426566 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.444949 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.465680 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.485991 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.504634 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.534477 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd5
9e5002e6d73c87e0e4b40d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:12Z\\\",\\\"message\\\":\\\"rk-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592206 7189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592263 7189 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592431 7189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592584 7189 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592655 7189 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592623 7189 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592770 7189 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592952 7189 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.553874 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.569339 4937 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.589179 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.621745 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.641085 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.659893 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.679694 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-
o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.693927 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.713181 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env
\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.727854 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.954119 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.954469 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.954523 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.954549 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.954565 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:31Z","lastTransitionTime":"2026-02-25T15:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.962536 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:48:31 crc kubenswrapper[4937]: E0225 15:48:31.962677 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:35.962648612 +0000 UTC m=+226.976040532 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.962776 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.962865 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:31 crc kubenswrapper[4937]: E0225 15:48:31.962981 4937 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:48:31 crc kubenswrapper[4937]: E0225 15:48:31.963028 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:49:35.963015541 +0000 UTC m=+226.976407461 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:48:31 crc kubenswrapper[4937]: E0225 15:48:31.963273 4937 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:48:31 crc kubenswrapper[4937]: E0225 15:48:31.963328 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:49:35.963313378 +0000 UTC m=+226.976705308 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:48:31 crc kubenswrapper[4937]: E0225 15:48:31.975681 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 
2025-08-24T17:21:41Z" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.982530 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.982807 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.983001 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.983202 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:31 crc kubenswrapper[4937]: I0225 15:48:31.983391 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:31Z","lastTransitionTime":"2026-02-25T15:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.001039 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:31Z is after 
2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.008440 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.008524 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.008544 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.008570 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.008599 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:32Z","lastTransitionTime":"2026-02-25T15:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.027301 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 
2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.031966 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.032019 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.032037 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.032060 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.032076 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:32Z","lastTransitionTime":"2026-02-25T15:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.052327 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 
2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.056944 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.057009 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.057039 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.057084 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.057108 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:32Z","lastTransitionTime":"2026-02-25T15:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.063601 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.063657 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.063880 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.063824 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.063913 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.063926 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.063929 4937 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.063939 4937 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.064163 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 15:49:36.063980924 +0000 UTC m=+227.077372824 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.064188 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 15:49:36.064178688 +0000 UTC m=+227.077570588 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.074479 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4
a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.074645 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.201595 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlbgx_f193b13f-50ab-454a-9230-a96922b25186/kube-multus/0.log" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.201643 4937 generic.go:334] "Generic (PLEG): container finished" podID="f193b13f-50ab-454a-9230-a96922b25186" containerID="55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e" exitCode=1 Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.201676 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlbgx" event={"ID":"f193b13f-50ab-454a-9230-a96922b25186","Type":"ContainerDied","Data":"55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e"} Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.202415 4937 scope.go:117] "RemoveContainer" containerID="55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.221984 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.252107 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.276050 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.295804 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.315060 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.335979 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.352318 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.367200 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.367271 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.367348 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.367209 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.367591 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.367600 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.385319 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd5
9e5002e6d73c87e0e4b40d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:12Z\\\",\\\"message\\\":\\\"rk-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592206 7189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592263 7189 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592431 7189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592584 7189 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592655 7189 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592623 7189 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592770 7189 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592952 7189 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.406993 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.422417 4937 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.446410 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.460428 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 
15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.470015 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.489804 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.507588 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.520509 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"2026-02-25T15:47:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8\\\\n2026-02-25T15:47:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8 to /host/opt/cni/bin/\\\\n2026-02-25T15:47:47Z [verbose] multus-daemon started\\\\n2026-02-25T15:47:47Z [verbose] Readiness Indicator file check\\\\n2026-02-25T15:48:32Z [error] have you checked that your 
default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.530697 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: I0225 15:48:32.538560 4937 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:32Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:32 crc kubenswrapper[4937]: E0225 15:48:32.613396 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
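Note on the two network errors just above: the kube-multus message ("still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf ... pollimmediate error: timed out waiting for the condition") and the kubelet's "no CNI configuration file in /etc/kubernetes/cni/net.d/" both describe the same state, namely that the OVN-Kubernetes CNI configuration has not been written yet. The error wording matches the k8s.io/apimachinery wait helpers; the snippet below is a hypothetical sketch (not the multus source) of a PollImmediate loop waiting for such an indicator file, which fails with exactly the "timed out waiting for the condition" message quoted in the log. The path comes from the log; the 1s/45s intervals are illustrative guesses based on the container start and failure timestamps.

```go
// Hypothetical sketch of a readiness-indicator wait, assuming the
// k8s.io/apimachinery wait helpers; only the file path is taken from the log.
package main

import (
	"fmt"
	"os"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

func main() {
	const indicator = "/host/run/multus/cni/net.d/10-ovn-kubernetes.conf"

	// Poll once per second until the file appears or the timeout elapses.
	err := wait.PollImmediate(1*time.Second, 45*time.Second, func() (bool, error) {
		_, statErr := os.Stat(indicator)
		if statErr == nil {
			return true, nil // indicator exists: default network is ready
		}
		if os.IsNotExist(statErr) {
			return false, nil // not written yet, keep polling
		}
		return false, statErr // unexpected error, stop polling
	})
	if err != nil {
		// On timeout err is "timed out waiting for the condition",
		// the message quoted in the kube-multus entry above.
		fmt.Println("readiness indicator check failed:", err)
	}
}
```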
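Every "Failed to update status for pod" entry in this section is rejected for the same underlying reason: the serving certificate of the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, while the node clock reads 2026-02-25. Below is a minimal sketch, assuming the webhook endpoint from the log is reachable from where it runs, for confirming that expiry directly; InsecureSkipVerify is used only so the handshake completes and the expired peer certificate can be read, and the final comparison mirrors the "current time ... is after ..." check reported by crypto/tls in the log.

```go
// Minimal sketch: dial the webhook endpoint seen in the log and print the
// peer certificate validity window. Assumes it runs on the node itself.
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()

	now := time.Now()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%q notBefore=%s notAfter=%s expired=%v\n",
			cert.Subject.CommonName,
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339),
			now.After(cert.NotAfter)) // corresponds to the x509 "is after" error above
	}
}
```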
Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.207043 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlbgx_f193b13f-50ab-454a-9230-a96922b25186/kube-multus/0.log" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.207114 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlbgx" event={"ID":"f193b13f-50ab-454a-9230-a96922b25186","Type":"ContainerStarted","Data":"1d677612e23253e09a2a6bed76138c39ace5b451a67bb9fd309647de2d8b6b02"} Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.229207 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.247254 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.264079 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.297419 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd5
9e5002e6d73c87e0e4b40d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:12Z\\\",\\\"message\\\":\\\"rk-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592206 7189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592263 7189 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592431 7189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592584 7189 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592655 7189 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592623 7189 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592770 7189 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592952 7189 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.319360 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.333605 4937 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.356526 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.368423 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:33 crc kubenswrapper[4937]: E0225 15:48:33.368741 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.384654 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.400771 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.435275 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.457069 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.472923 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d677612e23253e09a2a6bed76138c39ace5b451a67bb9fd309647de2d8b6b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"2026-02-25T15:47:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8\\\\n2026-02-25T15:47:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8 to /host/opt/cni/bin/\\\\n2026-02-25T15:47:47Z [verbose] multus-daemon started\\\\n2026-02-25T15:47:47Z [verbose] Readiness Indicator file check\\\\n2026-02-25T15:48:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:48:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.486915 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.504370 4937 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.520384 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.542603 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.561235 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:33 crc kubenswrapper[4937]: I0225 15:48:33.577201 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:33Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:34 crc kubenswrapper[4937]: I0225 15:48:34.366906 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:34 crc kubenswrapper[4937]: I0225 15:48:34.366933 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:34 crc kubenswrapper[4937]: I0225 15:48:34.367054 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:34 crc kubenswrapper[4937]: E0225 15:48:34.367170 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:34 crc kubenswrapper[4937]: E0225 15:48:34.367386 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:34 crc kubenswrapper[4937]: E0225 15:48:34.367506 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:35 crc kubenswrapper[4937]: I0225 15:48:35.367330 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:35 crc kubenswrapper[4937]: E0225 15:48:35.367556 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:36 crc kubenswrapper[4937]: I0225 15:48:36.366680 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:36 crc kubenswrapper[4937]: I0225 15:48:36.366740 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:36 crc kubenswrapper[4937]: I0225 15:48:36.366696 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:36 crc kubenswrapper[4937]: E0225 15:48:36.366913 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:36 crc kubenswrapper[4937]: E0225 15:48:36.367069 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:36 crc kubenswrapper[4937]: E0225 15:48:36.367248 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:37 crc kubenswrapper[4937]: I0225 15:48:37.367318 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:37 crc kubenswrapper[4937]: E0225 15:48:37.367556 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:37 crc kubenswrapper[4937]: E0225 15:48:37.614969 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:48:38 crc kubenswrapper[4937]: I0225 15:48:38.367678 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:38 crc kubenswrapper[4937]: I0225 15:48:38.367681 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:38 crc kubenswrapper[4937]: I0225 15:48:38.367681 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:38 crc kubenswrapper[4937]: E0225 15:48:38.368193 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:38 crc kubenswrapper[4937]: I0225 15:48:38.368391 4937 scope.go:117] "RemoveContainer" containerID="b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54" Feb 25 15:48:38 crc kubenswrapper[4937]: E0225 15:48:38.368467 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:38 crc kubenswrapper[4937]: E0225 15:48:38.368714 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.231606 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/2.log" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.236009 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerStarted","Data":"d3ea59bb1816d1d9773c4d501d1e15f12b6727f45cca1fdc4c7b9ebf620942ee"} Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.236651 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.250017 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.272145 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.292600 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.318429 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.334978 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.353692 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d677612e23253e09a2a6bed76138c39ace5b451a67bb9fd309647de2d8b6b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"2026-02-25T15:47:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8\\\\n2026-02-25T15:47:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8 to /host/opt/cni/bin/\\\\n2026-02-25T15:47:47Z [verbose] multus-daemon started\\\\n2026-02-25T15:47:47Z [verbose] Readiness Indicator file check\\\\n2026-02-25T15:48:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:48:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.367150 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:39 crc kubenswrapper[4937]: E0225 15:48:39.367618 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.372201 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.383273 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.387200 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.410404 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 
15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.432025 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.459070 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.481207 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.504621 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.534618 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.559235 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 
2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.576416 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.602284 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ea59bb1816d1d9773c4d50
1d1e15f12b6727f45cca1fdc4c7b9ebf620942ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:12Z\\\",\\\"message\\\":\\\"rk-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592206 7189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592263 7189 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592431 7189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592584 7189 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592655 7189 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592623 7189 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592770 7189 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592952 7189 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:39 crc kubenswrapper[4937]: I0225 15:48:39.618114 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' 
']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a57
8bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:39Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:40 crc kubenswrapper[4937]: I0225 15:48:40.366899 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:40 crc kubenswrapper[4937]: I0225 15:48:40.366912 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:40 crc kubenswrapper[4937]: E0225 15:48:40.367082 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:40 crc kubenswrapper[4937]: E0225 15:48:40.367192 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:40 crc kubenswrapper[4937]: I0225 15:48:40.366918 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:40 crc kubenswrapper[4937]: E0225 15:48:40.367293 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.245241 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/3.log" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.246153 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/2.log" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.249625 4937 generic.go:334] "Generic (PLEG): container finished" podID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerID="d3ea59bb1816d1d9773c4d501d1e15f12b6727f45cca1fdc4c7b9ebf620942ee" exitCode=1 Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.249671 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"d3ea59bb1816d1d9773c4d501d1e15f12b6727f45cca1fdc4c7b9ebf620942ee"} Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.249762 4937 scope.go:117] "RemoveContainer" containerID="b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.250643 4937 scope.go:117] "RemoveContainer" containerID="d3ea59bb1816d1d9773c4d501d1e15f12b6727f45cca1fdc4c7b9ebf620942ee" Feb 25 15:48:41 crc kubenswrapper[4937]: E0225 15:48:41.250893 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.283209 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.298076 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.320481 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.338147 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.352853 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcd4d6bd-701d-4fb4-90ef-fab014bd483f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169381c1c0271e964095447cf0d45da0a846d0f98f2e5a70395b5415f6c81524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0df89d85c9e461bd8a205ce20ba9c2e6b46dd742027eaa0af058a84a6a2bd52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0df89d85c9e461bd8a205ce20ba9c2e6b46dd742027eaa0af058a84a6a2bd52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.366787 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:41 crc kubenswrapper[4937]: E0225 15:48:41.366943 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.381404 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b96322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.394021 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.410894 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d677612e23253e09a2a6bed76138c39ace5b451a67bb9fd309647de2d8b6b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"2026-02-25T15:47:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8\\\\n2026-02-25T15:47:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8 to 
/host/opt/cni/bin/\\\\n2026-02-25T15:47:47Z [verbose] multus-daemon started\\\\n2026-02-25T15:47:47Z [verbose] Readiness Indicator file check\\\\n2026-02-25T15:48:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:48:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.430956 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.445542 4937 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.457665 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 
15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.473001 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.489443 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.506301 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.519421 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.533646 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.550885 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.565807 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.583964 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ea59bb1816d1d9773c4d501d1e15f12b6727f4
5cca1fdc4c7b9ebf620942ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:12Z\\\",\\\"message\\\":\\\"rk-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592206 7189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592263 7189 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592431 7189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592584 7189 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592655 7189 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592623 7189 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592770 7189 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592952 7189 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3ea59bb1816d1d9773c4d501d1e15f12b6727f45cca1fdc4c7b9ebf620942ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:40Z\\\",\\\"message\\\":\\\"\\\\nI0225 15:48:40.302698 7629 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.302752 7629 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.303118 7629 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.303682 7629 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.303941 7629 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.304285 7629 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0225 15:48:40.304307 7629 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0225 15:48:40.304320 7629 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0225 15:48:40.304326 7629 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0225 15:48:40.304347 7629 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0225 15:48:40.304376 7629 factory.go:656] Stopping watch 
factory\\\\nI0225 15:48:40.304390 7629 ovnkube.go:599] Stopped ovnkube\\\\nI0225 15:48:40.304445 7629 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0225 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cr
i-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.602742 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.616022 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.642964 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.659090 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcd4d6bd-701d-4fb4-90ef-fab014bd483f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169381c1c0271e964095447cf0d45da0a846d0f98f2e5a70395b5415f6c81524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0df89d85c9e461bd8a205ce20ba9c2e6b46dd742027eaa0af058a84a6a2bd52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0df89d85c9e461bd8a205ce20ba9c2e6b46dd742027eaa0af058a84a6a2bd52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.692368 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.711031 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.732865 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d677612e23253e09a2a6bed76138c39ace5b451a67bb9fd309647de2d8b6b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"2026-02-25T15:47:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8\\\\n2026-02-25T15:47:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8 to /host/opt/cni/bin/\\\\n2026-02-25T15:47:47Z [verbose] multus-daemon started\\\\n2026-02-25T15:47:47Z [verbose] Readiness Indicator file check\\\\n2026-02-25T15:48:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:48:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.749158 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.760398 4937 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.774594 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 
15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.788827 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.808396 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.826812 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.846159 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.865248 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.881540 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.893923 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.906415 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:41 crc kubenswrapper[4937]: I0225 15:48:41.926814 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ea59bb1816d1d9773c4d501d1e15f12b6727f4
5cca1fdc4c7b9ebf620942ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:12Z\\\",\\\"message\\\":\\\"rk-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592206 7189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592263 7189 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592431 7189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592584 7189 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592655 7189 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592623 7189 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592770 7189 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592952 7189 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3ea59bb1816d1d9773c4d501d1e15f12b6727f45cca1fdc4c7b9ebf620942ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:40Z\\\",\\\"message\\\":\\\"\\\\nI0225 15:48:40.302698 7629 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.302752 7629 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.303118 7629 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.303682 7629 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.303941 7629 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.304285 7629 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0225 15:48:40.304307 7629 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0225 15:48:40.304320 7629 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0225 15:48:40.304326 7629 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0225 15:48:40.304347 7629 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0225 15:48:40.304376 7629 factory.go:656] Stopping watch 
factory\\\\nI0225 15:48:40.304390 7629 ovnkube.go:599] Stopped ovnkube\\\\nI0225 15:48:40.304445 7629 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0225 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cr
i-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:41Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.257005 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/3.log" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.367124 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.367172 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.367232 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:42 crc kubenswrapper[4937]: E0225 15:48:42.367325 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:42 crc kubenswrapper[4937]: E0225 15:48:42.367641 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:42 crc kubenswrapper[4937]: E0225 15:48:42.367950 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.468790 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.468879 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.468899 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.468924 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.468944 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:42Z","lastTransitionTime":"2026-02-25T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:42 crc kubenswrapper[4937]: E0225 15:48:42.489207 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:42Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.494542 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.494614 4937 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.494638 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.494662 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.494678 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:42Z","lastTransitionTime":"2026-02-25T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:42 crc kubenswrapper[4937]: E0225 15:48:42.516757 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:42Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.521693 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.521735 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.521751 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.521773 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.521789 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:42Z","lastTransitionTime":"2026-02-25T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:42 crc kubenswrapper[4937]: E0225 15:48:42.538812 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:42Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.542335 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.542382 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.542398 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.542426 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.542443 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:42Z","lastTransitionTime":"2026-02-25T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:42 crc kubenswrapper[4937]: E0225 15:48:42.558575 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:42Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.562190 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.562242 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.562257 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.562278 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:42 crc kubenswrapper[4937]: I0225 15:48:42.562292 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:42Z","lastTransitionTime":"2026-02-25T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:42 crc kubenswrapper[4937]: E0225 15:48:42.582366 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:42Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:42 crc kubenswrapper[4937]: E0225 15:48:42.582680 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:48:42 crc kubenswrapper[4937]: E0225 15:48:42.616542 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:48:43 crc kubenswrapper[4937]: I0225 15:48:43.367248 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:43 crc kubenswrapper[4937]: E0225 15:48:43.367404 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:44 crc kubenswrapper[4937]: I0225 15:48:44.367461 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:44 crc kubenswrapper[4937]: I0225 15:48:44.367472 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:44 crc kubenswrapper[4937]: E0225 15:48:44.367710 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:44 crc kubenswrapper[4937]: E0225 15:48:44.367829 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:44 crc kubenswrapper[4937]: I0225 15:48:44.367929 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:44 crc kubenswrapper[4937]: E0225 15:48:44.368055 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:45 crc kubenswrapper[4937]: I0225 15:48:45.366819 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:45 crc kubenswrapper[4937]: E0225 15:48:45.367377 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:46 crc kubenswrapper[4937]: I0225 15:48:46.366776 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:46 crc kubenswrapper[4937]: I0225 15:48:46.366920 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:46 crc kubenswrapper[4937]: E0225 15:48:46.367031 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:46 crc kubenswrapper[4937]: I0225 15:48:46.366795 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:46 crc kubenswrapper[4937]: E0225 15:48:46.367207 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:46 crc kubenswrapper[4937]: E0225 15:48:46.367354 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:47 crc kubenswrapper[4937]: I0225 15:48:47.367583 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:47 crc kubenswrapper[4937]: E0225 15:48:47.367817 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:47 crc kubenswrapper[4937]: E0225 15:48:47.618184 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:48:48 crc kubenswrapper[4937]: I0225 15:48:48.366653 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:48 crc kubenswrapper[4937]: I0225 15:48:48.366662 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:48 crc kubenswrapper[4937]: E0225 15:48:48.366808 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:48 crc kubenswrapper[4937]: I0225 15:48:48.367080 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:48 crc kubenswrapper[4937]: E0225 15:48:48.367157 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:48 crc kubenswrapper[4937]: E0225 15:48:48.367219 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:49 crc kubenswrapper[4937]: I0225 15:48:49.367579 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:49 crc kubenswrapper[4937]: E0225 15:48:49.367713 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:50 crc kubenswrapper[4937]: I0225 15:48:50.366594 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:50 crc kubenswrapper[4937]: I0225 15:48:50.366686 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:50 crc kubenswrapper[4937]: E0225 15:48:50.367158 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:50 crc kubenswrapper[4937]: E0225 15:48:50.366959 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:50 crc kubenswrapper[4937]: I0225 15:48:50.366720 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:50 crc kubenswrapper[4937]: E0225 15:48:50.367299 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.367276 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:51 crc kubenswrapper[4937]: E0225 15:48:51.367920 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.382802 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.393819 4937 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcd4d6bd-701d-4fb4-90ef-fab014bd483f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169381c1c0271e964095447cf0d45da0a846d0f98f2e5a70395b5415f6c81524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0df89d85c9e461bd8a205ce20ba9c2e6b46dd742027eaa0af058a84a6a2bd52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0df89d85c9e461bd8a205ce20ba9c2e6b46dd742027eaa0af058a84a6a2bd52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.417134 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.435440 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.455217 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d677612e23253e09a2a6bed76138c39ace5b451a67bb9fd309647de2d8b6b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"2026-02-25T15:47:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8\\\\n2026-02-25T15:47:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8 to /host/opt/cni/bin/\\\\n2026-02-25T15:47:47Z [verbose] multus-daemon started\\\\n2026-02-25T15:47:47Z [verbose] Readiness Indicator file check\\\\n2026-02-25T15:48:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:48:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.467572 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.481179 4937 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.496326 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 
15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.514057 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.534624 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.555068 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.569772 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.584948 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.600255 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.611424 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.632792 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ea59bb1816d1d9773c4d501d1e15f12b6727f4
5cca1fdc4c7b9ebf620942ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b46c3b7096fb6beaf5f259551907cc837f7f5cd59e5002e6d73c87e0e4b40d54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:12Z\\\",\\\"message\\\":\\\"rk-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592206 7189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592263 7189 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 15:48:12.592431 7189 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 15:48:12.592584 7189 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592655 7189 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592623 7189 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592770 7189 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:12.592952 7189 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3ea59bb1816d1d9773c4d501d1e15f12b6727f45cca1fdc4c7b9ebf620942ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:40Z\\\",\\\"message\\\":\\\"\\\\nI0225 15:48:40.302698 7629 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.302752 7629 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.303118 7629 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.303682 7629 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.303941 7629 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.304285 7629 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0225 15:48:40.304307 7629 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0225 15:48:40.304320 7629 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0225 15:48:40.304326 7629 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0225 15:48:40.304347 7629 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0225 15:48:40.304376 7629 factory.go:656] Stopping watch 
factory\\\\nI0225 15:48:40.304390 7629 ovnkube.go:599] Stopped ovnkube\\\\nI0225 15:48:40.304445 7629 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0225 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cr
i-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.645863 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.655370 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:51 crc kubenswrapper[4937]: I0225 15:48:51.667528 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:51Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.366874 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.366923 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:52 crc kubenswrapper[4937]: E0225 15:48:52.367026 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.366879 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:52 crc kubenswrapper[4937]: E0225 15:48:52.367144 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:52 crc kubenswrapper[4937]: E0225 15:48:52.367314 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.585354 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.585423 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.585447 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.585471 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.585508 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:52Z","lastTransitionTime":"2026-02-25T15:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 15:48:52 crc kubenswrapper[4937]: E0225 15:48:52.597409 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:52Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.601746 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.601798 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.601817 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.601843 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.601863 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:52Z","lastTransitionTime":"2026-02-25T15:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:52 crc kubenswrapper[4937]: E0225 15:48:52.615704 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:52Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:52 crc kubenswrapper[4937]: E0225 15:48:52.619093 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.619696 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.619725 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.619733 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.619748 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.619757 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:52Z","lastTransitionTime":"2026-02-25T15:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:52 crc kubenswrapper[4937]: E0225 15:48:52.638539 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:52Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.643262 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.643312 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.643327 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.643347 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.643363 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:52Z","lastTransitionTime":"2026-02-25T15:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:52 crc kubenswrapper[4937]: E0225 15:48:52.660816 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:52Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.669558 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.669620 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.669638 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.669662 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:48:52 crc kubenswrapper[4937]: I0225 15:48:52.669679 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:48:52Z","lastTransitionTime":"2026-02-25T15:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:48:52 crc kubenswrapper[4937]: E0225 15:48:52.684176 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"64e5ecf3-73e7-446f-b239-94e67d809649\\\",\\\"systemUUID\\\":\\\"3428e233-7684-44b8-9625-ba84b10ba5fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:52Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:52 crc kubenswrapper[4937]: E0225 15:48:52.684415 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.367145 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.368332 4937 scope.go:117] "RemoveContainer" containerID="d3ea59bb1816d1d9773c4d501d1e15f12b6727f45cca1fdc4c7b9ebf620942ee" Feb 25 15:48:53 crc kubenswrapper[4937]: E0225 15:48:53.368701 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:53 crc kubenswrapper[4937]: E0225 15:48:53.370160 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.384226 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ddf09d5-0ab8-4bb9-a321-b1a29590b29f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea
3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:47:07Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 15:47:07.123035 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 15:47:07.123291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 15:47:07.124473 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4063007363/tls.crt::/tmp/serving-cert-4063007363/tls.key\\\\\\\"\\\\nI0225 15:47:07.375384 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 15:47:07.377227 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 15:47:07.377251 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 15:47:07.377275 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 15:47:07.377283 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 15:47:07.382003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 15:47:07.382050 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 15:47:07.382076 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0225 15:47:07.382016 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW0225 15:47:07.382086 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 15:47:07.382145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 15:47:07.382177 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0225 15:47:07.384003 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.393525 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vrqcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0f809a1-5ded-4908-ab90-a91c806e2302\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007161670aa7e21eac40a3874db3ac6b2b000ffa58f3ed709ab026dcc0a48925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p4wr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vrqcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.413164 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-crvn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad09c8-a744-4e42-a270-8cfee256b07f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://207f75a77665abaaadda7cd74e77a51d009e7c39c1c47e0728302aff9808ec7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8907025cb75ef8caffb18b945606b990b761338b7aa1645cc15e78e6197045fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e6ba3c6df0b6618e7fb3dd15460cb49b3005401fb706c3dde1f93c8b15014b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48d700b8ce2ada9fe0f91630d9c80c892285d7845e9c66a88f18fd04463982b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb85b7b08c629c9703ea1f6702079f920c705ba617adb55b9fa64a61ecc6df7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a24edba2a0d8717bb32e1ab7bbc80cc4d754378c9bc4a1803e2bad9aeb55e597\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ddfa82ecc9b104c742c02bbbe30962e611a0dc65fef8b51d2b5b5c9622a1628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmvp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-crvn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.429334 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f125006f-2b26-4ffe-ac0d-dc756f48b067\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs742\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sz7zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.444600 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcd4d6bd-701d-4fb4-90ef-fab014bd483f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169381c1c0271e964095447cf0d45da0a846d0f98f2e5a70395b5415f6c81524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0df89d85c9e461bd8a205ce20ba9c2e6b46dd742027eaa0af058a84a6a2bd52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0df89d85c9e461bd8a205ce20ba9c2e6b46dd742027eaa0af058a84a6a2bd52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.476237 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"854da11c-fe9a-4cea-a010-445068c2f1b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1a99b9c6d633c4a67bc307558d440e741c013388a9a3361e3e22fe482d6183d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a813da0d682c491b25e286e3fc4a294379087d541477ce90c648b66441587edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://640fe064339ecc4e5519238119ea17064113e228ffeea68d597f71a358d40205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57df5d32aa12807288dc1d6c45adc6397b964b9
6322df1501cbd92fb45a36f4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200faa346b1de5e7e4e5d7e15dcd9e28f7b8fb7f195a6f1e869e79d914e2c72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14cd412c84c19deb1244a15ef34ae8fc11f5ff846d64a51b488e1304392ab1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721d630ca152c85e766052bc38a0a3bb8f8dbbf5a51506bc57436cd22195ced\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e834235c96a07f885633541c98a79ecabac0b4e26c06408d239ae2d4d6c1cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.496531 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.519196 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlbgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f193b13f-50ab-454a-9230-a96922b25186\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:48:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d677612e23253e09a2a6bed76138c39ace5b451a67bb9fd309647de2d8b6b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:32Z\\\",\\\"message\\\":\\\"2026-02-25T15:47:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8\\\\n2026-02-25T15:47:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac97b035-395d-47a1-8afb-b1dd6938fec8 to /host/opt/cni/bin/\\\\n2026-02-25T15:47:47Z [verbose] multus-daemon started\\\\n2026-02-25T15:47:47Z [verbose] Readiness Indicator file check\\\\n2026-02-25T15:48:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:48:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vxd2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlbgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.533320 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f826096-fb93-42fe-a779-9afe1d36f2d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23efce504cf84eeae2b051b3a4bcc6f334998b3890f234c71fd8386b1ad4d228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8rbb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2r4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.544387 4937 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-6pm6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc7de10e-0fed-4e96-871d-e80dbc4134c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde123ce94389a2bdae9df5edfe9b924e377af78137451c51960829c64aa6b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwzjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6pm6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.559276 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c923347-0d7f-4647-a3d3-1a0e5e68daaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b45ca472b845d8833b0ed47b824e3e19eef871f2123015dceff8babd56c969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de4279acb55b31ad7867f0d2844ebb53bfa8e588618e96d6dcd67ee16d0107c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjdn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mjkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 
15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.575218 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2967b0de-f792-4dab-bbd8-f642a206b4c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa50431aec215810d5df09bffc7a8ee235cb9b5fa1348a40af1b1d2509cac1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8faa39fd670a093db40f30a9828d7c1639ff813d7d73b26644c0235a17066ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://297794fce6ed3384fe327c3e232bd71b62e176895c836f050efec230bc468ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4189b6f91fa3f07a4d37d7d2c2ca80b26b829bd21d2b31a0a7b63174f05a9cc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.595293 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.612588 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://356986d72ba241bf45ca268f787d01138ecf97c706eaacdf90b08b46fd633243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.629243 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.645746 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ffd64c0-1eee-44be-ac62-f6d4a1345e9a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15cd0873553b7eaddbc5687e9ac2cd3d2f1795b62b95f222ae6c1ebbe7825e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c55092c0af2606d3556da851963c16d34dca6b7dd96e546b3d0f313e54f9f75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T15:46:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 15:45:57.399872 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0225 15:45:57.409956 1 observer_polling.go:159] Starting file observer\\\\nI0225 15:45:57.595257 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 15:45:57.599065 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 15:46:27.751641 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a64e1a199621535874072ea873e7c304a1535e76e5120b01a73272eaa2a2ffb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35b893e99af6b4a3e47a7e7a198473cce70c67f787affec64f2de7abbcd876c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.660082 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b02448442a16428a75519fe9fdf5be90e5cf46a64179f01548466062be166679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.680258 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31960ddb77b134fe6eb7dcee8662f8cc09bf55a1540edbc7e99ff0da8be890cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:53 crc kubenswrapper[4937]: I0225 15:48:53.705302 4937 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89a5d3cb-d884-4e27-90df-972e98830bcb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T15:47:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ea59bb1816d1d9773c4d501d1e15f12b6727f4
5cca1fdc4c7b9ebf620942ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3ea59bb1816d1d9773c4d501d1e15f12b6727f45cca1fdc4c7b9ebf620942ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T15:48:40Z\\\",\\\"message\\\":\\\"\\\\nI0225 15:48:40.302698 7629 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.302752 7629 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.303118 7629 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.303682 7629 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.303941 7629 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 15:48:40.304285 7629 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0225 15:48:40.304307 7629 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0225 15:48:40.304320 7629 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0225 15:48:40.304326 7629 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0225 15:48:40.304347 7629 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0225 15:48:40.304376 7629 factory.go:656] Stopping watch factory\\\\nI0225 15:48:40.304390 7629 ovnkube.go:599] Stopped ovnkube\\\\nI0225 15:48:40.304445 7629 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0225 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T15:48:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2px4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T15:47:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cl2zn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T15:48:53Z is after 2025-08-24T17:21:41Z" Feb 25 15:48:54 crc kubenswrapper[4937]: I0225 15:48:54.367518 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:54 crc kubenswrapper[4937]: I0225 15:48:54.367620 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:54 crc kubenswrapper[4937]: I0225 15:48:54.367672 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:54 crc kubenswrapper[4937]: E0225 15:48:54.367749 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:54 crc kubenswrapper[4937]: E0225 15:48:54.368165 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:54 crc kubenswrapper[4937]: E0225 15:48:54.368421 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:55 crc kubenswrapper[4937]: I0225 15:48:55.366987 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:55 crc kubenswrapper[4937]: E0225 15:48:55.367171 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:56 crc kubenswrapper[4937]: I0225 15:48:56.366741 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:56 crc kubenswrapper[4937]: I0225 15:48:56.366832 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:56 crc kubenswrapper[4937]: I0225 15:48:56.366876 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:56 crc kubenswrapper[4937]: E0225 15:48:56.366905 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:56 crc kubenswrapper[4937]: E0225 15:48:56.367070 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:56 crc kubenswrapper[4937]: E0225 15:48:56.367164 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:57 crc kubenswrapper[4937]: I0225 15:48:57.366757 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:57 crc kubenswrapper[4937]: E0225 15:48:57.366951 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:48:57 crc kubenswrapper[4937]: E0225 15:48:57.620828 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:48:58 crc kubenswrapper[4937]: I0225 15:48:58.366730 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:48:58 crc kubenswrapper[4937]: I0225 15:48:58.366797 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:58 crc kubenswrapper[4937]: E0225 15:48:58.366855 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:48:58 crc kubenswrapper[4937]: I0225 15:48:58.366920 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:48:58 crc kubenswrapper[4937]: E0225 15:48:58.366988 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:48:58 crc kubenswrapper[4937]: E0225 15:48:58.367147 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:48:58 crc kubenswrapper[4937]: I0225 15:48:58.485772 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:48:58 crc kubenswrapper[4937]: E0225 15:48:58.485956 4937 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:48:58 crc kubenswrapper[4937]: E0225 15:48:58.486089 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs podName:f125006f-2b26-4ffe-ac0d-dc756f48b067 nodeName:}" failed. No retries permitted until 2026-02-25 15:50:02.486062837 +0000 UTC m=+253.499454757 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs") pod "network-metrics-daemon-sz7zh" (UID: "f125006f-2b26-4ffe-ac0d-dc756f48b067") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 15:48:59 crc kubenswrapper[4937]: I0225 15:48:59.366683 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:48:59 crc kubenswrapper[4937]: E0225 15:48:59.366868 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:00 crc kubenswrapper[4937]: I0225 15:49:00.367298 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:00 crc kubenswrapper[4937]: I0225 15:49:00.367336 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:00 crc kubenswrapper[4937]: I0225 15:49:00.367389 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:00 crc kubenswrapper[4937]: E0225 15:49:00.367475 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:00 crc kubenswrapper[4937]: E0225 15:49:00.367577 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:00 crc kubenswrapper[4937]: E0225 15:49:00.367671 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:01 crc kubenswrapper[4937]: I0225 15:49:01.367477 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:01 crc kubenswrapper[4937]: E0225 15:49:01.367682 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:01 crc kubenswrapper[4937]: I0225 15:49:01.398835 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=22.398812035 podStartE2EDuration="22.398812035s" podCreationTimestamp="2026-02-25 15:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:01.398439176 +0000 UTC m=+192.411831086" watchObservedRunningTime="2026-02-25 15:49:01.398812035 +0000 UTC m=+192.412203925" Feb 25 15:49:01 crc kubenswrapper[4937]: I0225 15:49:01.426032 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=82.426009033 podStartE2EDuration="1m22.426009033s" podCreationTimestamp="2026-02-25 15:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:01.425962722 +0000 UTC m=+192.439354642" watchObservedRunningTime="2026-02-25 15:49:01.426009033 +0000 UTC m=+192.439400923" Feb 25 15:49:01 crc kubenswrapper[4937]: I0225 15:49:01.477764 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podStartSLOduration=140.477740575 podStartE2EDuration="2m20.477740575s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:01.477170731 +0000 UTC m=+192.490562641" watchObservedRunningTime="2026-02-25 15:49:01.477740575 +0000 UTC m=+192.491132475" Feb 25 15:49:01 crc kubenswrapper[4937]: I0225 15:49:01.478024 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dlbgx" podStartSLOduration=140.478016231 podStartE2EDuration="2m20.478016231s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:01.462803103 +0000 UTC m=+192.476195003" watchObservedRunningTime="2026-02-25 15:49:01.478016231 +0000 UTC m=+192.491408151" Feb 25 15:49:01 crc kubenswrapper[4937]: I0225 15:49:01.487885 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6pm6h" podStartSLOduration=140.48786313 podStartE2EDuration="2m20.48786313s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:01.487338247 +0000 UTC m=+192.500730157" watchObservedRunningTime="2026-02-25 15:49:01.48786313 +0000 UTC m=+192.501255030" Feb 25 15:49:01 crc kubenswrapper[4937]: I0225 15:49:01.520461 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=58.520445448 podStartE2EDuration="58.520445448s" podCreationTimestamp="2026-02-25 15:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:01.52014599 +0000 UTC m=+192.533537890" 
watchObservedRunningTime="2026-02-25 15:49:01.520445448 +0000 UTC m=+192.533837338" Feb 25 15:49:01 crc kubenswrapper[4937]: I0225 15:49:01.520686 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mjkg" podStartSLOduration=139.520680293 podStartE2EDuration="2m19.520680293s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:01.504571234 +0000 UTC m=+192.517963174" watchObservedRunningTime="2026-02-25 15:49:01.520680293 +0000 UTC m=+192.534072183" Feb 25 15:49:01 crc kubenswrapper[4937]: I0225 15:49:01.582131 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.582112359 podStartE2EDuration="1m7.582112359s" podCreationTimestamp="2026-02-25 15:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:01.581024033 +0000 UTC m=+192.594415923" watchObservedRunningTime="2026-02-25 15:49:01.582112359 +0000 UTC m=+192.595504249" Feb 25 15:49:01 crc kubenswrapper[4937]: I0225 15:49:01.655948 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=87.655927525 podStartE2EDuration="1m27.655927525s" podCreationTimestamp="2026-02-25 15:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:01.655830643 +0000 UTC m=+192.669222543" watchObservedRunningTime="2026-02-25 15:49:01.655927525 +0000 UTC m=+192.669319405" Feb 25 15:49:01 crc kubenswrapper[4937]: I0225 15:49:01.667410 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vrqcw" podStartSLOduration=140.667390072 podStartE2EDuration="2m20.667390072s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:01.665857745 +0000 UTC m=+192.679249635" watchObservedRunningTime="2026-02-25 15:49:01.667390072 +0000 UTC m=+192.680781962" Feb 25 15:49:01 crc kubenswrapper[4937]: I0225 15:49:01.686622 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-crvn5" podStartSLOduration=140.686596247 podStartE2EDuration="2m20.686596247s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:01.684615829 +0000 UTC m=+192.698007739" watchObservedRunningTime="2026-02-25 15:49:01.686596247 +0000 UTC m=+192.699988157" Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.367205 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.367258 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.367263 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:02 crc kubenswrapper[4937]: E0225 15:49:02.367420 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:02 crc kubenswrapper[4937]: E0225 15:49:02.367616 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:02 crc kubenswrapper[4937]: E0225 15:49:02.367797 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:02 crc kubenswrapper[4937]: E0225 15:49:02.621454 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.907120 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.907160 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.907172 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.907189 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.907200 4937 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T15:49:02Z","lastTransitionTime":"2026-02-25T15:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.967863 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz"] Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.968583 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.974616 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.975911 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.976027 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 25 15:49:02 crc kubenswrapper[4937]: I0225 15:49:02.977047 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.033801 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.033855 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.033876 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.033892 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.033908 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.135205 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc 
kubenswrapper[4937]: I0225 15:49:03.135287 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.135320 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.135355 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.135562 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.135466 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.135667 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.136624 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.143231 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.166572 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-46jjz\" (UID: \"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.294616 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.339231 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" event={"ID":"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7","Type":"ContainerStarted","Data":"c1995860533933c916f55518a8b4c99d3f16c544af5cfa51759ae1157289902d"} Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.366837 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:03 crc kubenswrapper[4937]: E0225 15:49:03.367073 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.384383 4937 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 25 15:49:03 crc kubenswrapper[4937]: I0225 15:49:03.395344 4937 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 25 15:49:04 crc kubenswrapper[4937]: I0225 15:49:04.343525 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" event={"ID":"3e9f03d2-2448-4fdc-aee2-aa9aa0d2f6b7","Type":"ContainerStarted","Data":"991819f9bc49341f600d37fa8a00dd4228406efd8ad35565951b3d57f6274985"} Feb 25 15:49:04 crc kubenswrapper[4937]: I0225 15:49:04.367335 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:04 crc kubenswrapper[4937]: I0225 15:49:04.367383 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:04 crc kubenswrapper[4937]: I0225 15:49:04.367392 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:04 crc kubenswrapper[4937]: E0225 15:49:04.367471 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:04 crc kubenswrapper[4937]: E0225 15:49:04.367599 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:04 crc kubenswrapper[4937]: E0225 15:49:04.367702 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:05 crc kubenswrapper[4937]: I0225 15:49:05.367177 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:05 crc kubenswrapper[4937]: E0225 15:49:05.367365 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:06 crc kubenswrapper[4937]: I0225 15:49:06.366779 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:06 crc kubenswrapper[4937]: I0225 15:49:06.366904 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:06 crc kubenswrapper[4937]: I0225 15:49:06.367074 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:06 crc kubenswrapper[4937]: E0225 15:49:06.367247 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:06 crc kubenswrapper[4937]: E0225 15:49:06.367333 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:06 crc kubenswrapper[4937]: E0225 15:49:06.367830 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:06 crc kubenswrapper[4937]: I0225 15:49:06.368339 4937 scope.go:117] "RemoveContainer" containerID="d3ea59bb1816d1d9773c4d501d1e15f12b6727f45cca1fdc4c7b9ebf620942ee" Feb 25 15:49:06 crc kubenswrapper[4937]: E0225 15:49:06.368607 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" Feb 25 15:49:07 crc kubenswrapper[4937]: I0225 15:49:07.367327 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:07 crc kubenswrapper[4937]: E0225 15:49:07.367541 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:07 crc kubenswrapper[4937]: E0225 15:49:07.622979 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:49:08 crc kubenswrapper[4937]: I0225 15:49:08.366549 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:08 crc kubenswrapper[4937]: I0225 15:49:08.366593 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:08 crc kubenswrapper[4937]: E0225 15:49:08.366670 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:08 crc kubenswrapper[4937]: I0225 15:49:08.366608 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:08 crc kubenswrapper[4937]: E0225 15:49:08.366735 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:08 crc kubenswrapper[4937]: E0225 15:49:08.366810 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:09 crc kubenswrapper[4937]: I0225 15:49:09.367536 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:09 crc kubenswrapper[4937]: E0225 15:49:09.367687 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:10 crc kubenswrapper[4937]: I0225 15:49:10.366832 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:10 crc kubenswrapper[4937]: E0225 15:49:10.367001 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:10 crc kubenswrapper[4937]: I0225 15:49:10.366840 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:10 crc kubenswrapper[4937]: E0225 15:49:10.367535 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:10 crc kubenswrapper[4937]: I0225 15:49:10.367566 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:10 crc kubenswrapper[4937]: E0225 15:49:10.367703 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:11 crc kubenswrapper[4937]: I0225 15:49:11.366778 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:11 crc kubenswrapper[4937]: E0225 15:49:11.377738 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:12 crc kubenswrapper[4937]: I0225 15:49:12.367108 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:12 crc kubenswrapper[4937]: I0225 15:49:12.367203 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:12 crc kubenswrapper[4937]: E0225 15:49:12.367298 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:12 crc kubenswrapper[4937]: I0225 15:49:12.367325 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:12 crc kubenswrapper[4937]: E0225 15:49:12.367448 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:12 crc kubenswrapper[4937]: E0225 15:49:12.367837 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:12 crc kubenswrapper[4937]: E0225 15:49:12.625762 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 25 15:49:13 crc kubenswrapper[4937]: I0225 15:49:13.366851 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:13 crc kubenswrapper[4937]: E0225 15:49:13.367036 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:14 crc kubenswrapper[4937]: I0225 15:49:14.367398 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:14 crc kubenswrapper[4937]: E0225 15:49:14.367652 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:14 crc kubenswrapper[4937]: I0225 15:49:14.367423 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:14 crc kubenswrapper[4937]: I0225 15:49:14.367423 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:14 crc kubenswrapper[4937]: E0225 15:49:14.367759 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:14 crc kubenswrapper[4937]: E0225 15:49:14.367929 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:15 crc kubenswrapper[4937]: I0225 15:49:15.367626 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:15 crc kubenswrapper[4937]: E0225 15:49:15.369059 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:16 crc kubenswrapper[4937]: I0225 15:49:16.367159 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:16 crc kubenswrapper[4937]: I0225 15:49:16.367277 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:16 crc kubenswrapper[4937]: E0225 15:49:16.367375 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:16 crc kubenswrapper[4937]: E0225 15:49:16.367525 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:16 crc kubenswrapper[4937]: I0225 15:49:16.367716 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:16 crc kubenswrapper[4937]: E0225 15:49:16.367974 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:17 crc kubenswrapper[4937]: I0225 15:49:17.366820 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:17 crc kubenswrapper[4937]: E0225 15:49:17.367019 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:17 crc kubenswrapper[4937]: E0225 15:49:17.627852 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:49:18 crc kubenswrapper[4937]: I0225 15:49:18.367137 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:18 crc kubenswrapper[4937]: I0225 15:49:18.367217 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:18 crc kubenswrapper[4937]: E0225 15:49:18.367883 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:18 crc kubenswrapper[4937]: E0225 15:49:18.367944 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:18 crc kubenswrapper[4937]: I0225 15:49:18.367217 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:18 crc kubenswrapper[4937]: E0225 15:49:18.368082 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:18 crc kubenswrapper[4937]: I0225 15:49:18.390505 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlbgx_f193b13f-50ab-454a-9230-a96922b25186/kube-multus/1.log" Feb 25 15:49:18 crc kubenswrapper[4937]: I0225 15:49:18.391114 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlbgx_f193b13f-50ab-454a-9230-a96922b25186/kube-multus/0.log" Feb 25 15:49:18 crc kubenswrapper[4937]: I0225 15:49:18.391198 4937 generic.go:334] "Generic (PLEG): container finished" podID="f193b13f-50ab-454a-9230-a96922b25186" containerID="1d677612e23253e09a2a6bed76138c39ace5b451a67bb9fd309647de2d8b6b02" exitCode=1 Feb 25 15:49:18 crc kubenswrapper[4937]: I0225 15:49:18.391239 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlbgx" event={"ID":"f193b13f-50ab-454a-9230-a96922b25186","Type":"ContainerDied","Data":"1d677612e23253e09a2a6bed76138c39ace5b451a67bb9fd309647de2d8b6b02"} Feb 25 15:49:18 crc kubenswrapper[4937]: I0225 15:49:18.391281 4937 scope.go:117] "RemoveContainer" containerID="55f27c207d253be154dc2cac2171dfed82ba8c3972f3656d78bc138e14db684e" Feb 25 15:49:18 crc kubenswrapper[4937]: I0225 15:49:18.391859 4937 scope.go:117] "RemoveContainer" containerID="1d677612e23253e09a2a6bed76138c39ace5b451a67bb9fd309647de2d8b6b02" Feb 25 15:49:18 crc kubenswrapper[4937]: E0225 15:49:18.392097 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-dlbgx_openshift-multus(f193b13f-50ab-454a-9230-a96922b25186)\"" pod="openshift-multus/multus-dlbgx" podUID="f193b13f-50ab-454a-9230-a96922b25186" Feb 25 15:49:18 crc kubenswrapper[4937]: I0225 15:49:18.418853 
4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-46jjz" podStartSLOduration=157.418831414 podStartE2EDuration="2m37.418831414s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:04.361187582 +0000 UTC m=+195.374579482" watchObservedRunningTime="2026-02-25 15:49:18.418831414 +0000 UTC m=+209.432223334" Feb 25 15:49:19 crc kubenswrapper[4937]: I0225 15:49:19.367111 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:19 crc kubenswrapper[4937]: E0225 15:49:19.367286 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:19 crc kubenswrapper[4937]: I0225 15:49:19.396958 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlbgx_f193b13f-50ab-454a-9230-a96922b25186/kube-multus/1.log" Feb 25 15:49:20 crc kubenswrapper[4937]: I0225 15:49:20.367461 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:20 crc kubenswrapper[4937]: I0225 15:49:20.367546 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:20 crc kubenswrapper[4937]: I0225 15:49:20.367664 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:20 crc kubenswrapper[4937]: E0225 15:49:20.367653 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:20 crc kubenswrapper[4937]: E0225 15:49:20.367904 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:20 crc kubenswrapper[4937]: E0225 15:49:20.367995 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:20 crc kubenswrapper[4937]: I0225 15:49:20.369295 4937 scope.go:117] "RemoveContainer" containerID="d3ea59bb1816d1d9773c4d501d1e15f12b6727f45cca1fdc4c7b9ebf620942ee" Feb 25 15:49:20 crc kubenswrapper[4937]: E0225 15:49:20.369628 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cl2zn_openshift-ovn-kubernetes(89a5d3cb-d884-4e27-90df-972e98830bcb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" Feb 25 15:49:21 crc kubenswrapper[4937]: I0225 15:49:21.367458 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:21 crc kubenswrapper[4937]: E0225 15:49:21.369669 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:22 crc kubenswrapper[4937]: I0225 15:49:22.367101 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:22 crc kubenswrapper[4937]: I0225 15:49:22.367101 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:22 crc kubenswrapper[4937]: I0225 15:49:22.367345 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:22 crc kubenswrapper[4937]: E0225 15:49:22.367910 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:22 crc kubenswrapper[4937]: E0225 15:49:22.368117 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:22 crc kubenswrapper[4937]: E0225 15:49:22.368291 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:22 crc kubenswrapper[4937]: E0225 15:49:22.629251 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:49:23 crc kubenswrapper[4937]: I0225 15:49:23.366721 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:23 crc kubenswrapper[4937]: E0225 15:49:23.366987 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:24 crc kubenswrapper[4937]: I0225 15:49:24.367267 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:24 crc kubenswrapper[4937]: I0225 15:49:24.367350 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:24 crc kubenswrapper[4937]: I0225 15:49:24.367430 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:24 crc kubenswrapper[4937]: E0225 15:49:24.367599 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:24 crc kubenswrapper[4937]: E0225 15:49:24.367725 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:24 crc kubenswrapper[4937]: E0225 15:49:24.367885 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:25 crc kubenswrapper[4937]: I0225 15:49:25.366993 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:25 crc kubenswrapper[4937]: E0225 15:49:25.367153 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:26 crc kubenswrapper[4937]: I0225 15:49:26.367704 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:26 crc kubenswrapper[4937]: I0225 15:49:26.367714 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:26 crc kubenswrapper[4937]: E0225 15:49:26.367928 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:26 crc kubenswrapper[4937]: I0225 15:49:26.367718 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:26 crc kubenswrapper[4937]: E0225 15:49:26.368075 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:26 crc kubenswrapper[4937]: E0225 15:49:26.368149 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:27 crc kubenswrapper[4937]: I0225 15:49:27.367642 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:27 crc kubenswrapper[4937]: E0225 15:49:27.367879 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:27 crc kubenswrapper[4937]: E0225 15:49:27.630608 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 25 15:49:28 crc kubenswrapper[4937]: I0225 15:49:28.367067 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:28 crc kubenswrapper[4937]: I0225 15:49:28.367140 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:28 crc kubenswrapper[4937]: I0225 15:49:28.367205 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:28 crc kubenswrapper[4937]: E0225 15:49:28.367208 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:28 crc kubenswrapper[4937]: E0225 15:49:28.367319 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:28 crc kubenswrapper[4937]: E0225 15:49:28.367418 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:29 crc kubenswrapper[4937]: I0225 15:49:29.367617 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:29 crc kubenswrapper[4937]: E0225 15:49:29.367963 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:30 crc kubenswrapper[4937]: I0225 15:49:30.366895 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:30 crc kubenswrapper[4937]: I0225 15:49:30.366970 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:30 crc kubenswrapper[4937]: I0225 15:49:30.366923 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:30 crc kubenswrapper[4937]: E0225 15:49:30.367151 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:30 crc kubenswrapper[4937]: E0225 15:49:30.367388 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:30 crc kubenswrapper[4937]: E0225 15:49:30.367594 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:31 crc kubenswrapper[4937]: I0225 15:49:31.367248 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:31 crc kubenswrapper[4937]: E0225 15:49:31.369183 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:32 crc kubenswrapper[4937]: I0225 15:49:32.367015 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:32 crc kubenswrapper[4937]: I0225 15:49:32.367069 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:32 crc kubenswrapper[4937]: I0225 15:49:32.367195 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:32 crc kubenswrapper[4937]: E0225 15:49:32.367295 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:32 crc kubenswrapper[4937]: E0225 15:49:32.367434 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:32 crc kubenswrapper[4937]: E0225 15:49:32.367518 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:32 crc kubenswrapper[4937]: E0225 15:49:32.632036 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:49:33 crc kubenswrapper[4937]: I0225 15:49:33.366994 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:33 crc kubenswrapper[4937]: E0225 15:49:33.367181 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:33 crc kubenswrapper[4937]: I0225 15:49:33.367725 4937 scope.go:117] "RemoveContainer" containerID="1d677612e23253e09a2a6bed76138c39ace5b451a67bb9fd309647de2d8b6b02" Feb 25 15:49:33 crc kubenswrapper[4937]: I0225 15:49:33.368424 4937 scope.go:117] "RemoveContainer" containerID="d3ea59bb1816d1d9773c4d501d1e15f12b6727f45cca1fdc4c7b9ebf620942ee" Feb 25 15:49:34 crc kubenswrapper[4937]: I0225 15:49:34.366944 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:34 crc kubenswrapper[4937]: I0225 15:49:34.366976 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:34 crc kubenswrapper[4937]: E0225 15:49:34.367692 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:34 crc kubenswrapper[4937]: I0225 15:49:34.367054 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:34 crc kubenswrapper[4937]: E0225 15:49:34.367928 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:34 crc kubenswrapper[4937]: E0225 15:49:34.368068 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:34 crc kubenswrapper[4937]: I0225 15:49:34.453384 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/3.log" Feb 25 15:49:34 crc kubenswrapper[4937]: I0225 15:49:34.457151 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerStarted","Data":"e37144d901d75f45a8471889632d291826e668961da7dfc2db6033335001e105"} Feb 25 15:49:34 crc kubenswrapper[4937]: I0225 15:49:34.457579 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:49:34 crc kubenswrapper[4937]: I0225 15:49:34.459517 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlbgx_f193b13f-50ab-454a-9230-a96922b25186/kube-multus/1.log" Feb 25 15:49:34 crc kubenswrapper[4937]: I0225 15:49:34.459557 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlbgx" event={"ID":"f193b13f-50ab-454a-9230-a96922b25186","Type":"ContainerStarted","Data":"0451ad74afcc7887e90ed45f51efddafe48efb2249762c9aa5e2da84d9691199"} Feb 25 15:49:34 crc kubenswrapper[4937]: I0225 15:49:34.482067 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podStartSLOduration=173.482045649 podStartE2EDuration="2m53.482045649s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:34.481027884 +0000 UTC m=+225.494419774" watchObservedRunningTime="2026-02-25 15:49:34.482045649 +0000 UTC m=+225.495437539" Feb 25 15:49:34 crc kubenswrapper[4937]: I0225 15:49:34.648202 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sz7zh"] Feb 25 15:49:34 crc kubenswrapper[4937]: I0225 15:49:34.648344 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:34 crc kubenswrapper[4937]: E0225 15:49:34.648505 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:35 crc kubenswrapper[4937]: I0225 15:49:35.367645 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:35 crc kubenswrapper[4937]: E0225 15:49:35.367923 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:36 crc kubenswrapper[4937]: I0225 15:49:36.037234 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.037669 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:51:38.037605303 +0000 UTC m=+349.050997233 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:36 crc kubenswrapper[4937]: I0225 15:49:36.037783 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:36 crc kubenswrapper[4937]: I0225 15:49:36.038012 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.038059 4937 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.038179 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-25 15:51:38.038150766 +0000 UTC m=+349.051542686 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.038203 4937 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.038311 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 15:51:38.038287239 +0000 UTC m=+349.051679129 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 15:49:36 crc kubenswrapper[4937]: I0225 15:49:36.139795 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:36 crc kubenswrapper[4937]: I0225 15:49:36.139881 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.140088 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.140154 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.140113 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.140198 4937 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.140216 4937 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.140171 4937 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.140293 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 15:51:38.140269377 +0000 UTC m=+349.153661267 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.140342 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 15:51:38.140317658 +0000 UTC m=+349.153709728 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 15:49:36 crc kubenswrapper[4937]: I0225 15:49:36.366958 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.367091 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:36 crc kubenswrapper[4937]: I0225 15:49:36.366974 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.367163 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:36 crc kubenswrapper[4937]: I0225 15:49:36.367228 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:36 crc kubenswrapper[4937]: E0225 15:49:36.367455 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:37 crc kubenswrapper[4937]: I0225 15:49:37.367157 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:37 crc kubenswrapper[4937]: E0225 15:49:37.367316 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:37 crc kubenswrapper[4937]: E0225 15:49:37.633260 4937 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 15:49:38 crc kubenswrapper[4937]: I0225 15:49:38.367677 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:38 crc kubenswrapper[4937]: I0225 15:49:38.367780 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:38 crc kubenswrapper[4937]: I0225 15:49:38.367706 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:38 crc kubenswrapper[4937]: E0225 15:49:38.367921 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:38 crc kubenswrapper[4937]: E0225 15:49:38.368060 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:38 crc kubenswrapper[4937]: E0225 15:49:38.368230 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:39 crc kubenswrapper[4937]: I0225 15:49:39.367693 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:39 crc kubenswrapper[4937]: E0225 15:49:39.367871 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:40 crc kubenswrapper[4937]: I0225 15:49:40.366958 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:40 crc kubenswrapper[4937]: E0225 15:49:40.367085 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:40 crc kubenswrapper[4937]: I0225 15:49:40.367184 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:40 crc kubenswrapper[4937]: E0225 15:49:40.367383 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:40 crc kubenswrapper[4937]: I0225 15:49:40.367524 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:40 crc kubenswrapper[4937]: E0225 15:49:40.367736 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:41 crc kubenswrapper[4937]: I0225 15:49:41.366951 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:41 crc kubenswrapper[4937]: E0225 15:49:41.369925 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 15:49:41 crc kubenswrapper[4937]: I0225 15:49:41.856880 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 15:49:42 crc kubenswrapper[4937]: I0225 15:49:42.367744 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:42 crc kubenswrapper[4937]: I0225 15:49:42.367781 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:42 crc kubenswrapper[4937]: I0225 15:49:42.367873 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:42 crc kubenswrapper[4937]: E0225 15:49:42.367957 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:49:42 crc kubenswrapper[4937]: E0225 15:49:42.368165 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sz7zh" podUID="f125006f-2b26-4ffe-ac0d-dc756f48b067" Feb 25 15:49:42 crc kubenswrapper[4937]: E0225 15:49:42.368270 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.366968 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.370569 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.373238 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.735439 4937 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.778063 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-29fxd"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.778689 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.784583 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.785157 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.785992 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qfghw"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.786871 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.786881 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.786941 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.786979 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qfghw" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.786891 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.787089 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wlfqx"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.787215 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.787759 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.789867 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8zn9j"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.790310 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-znkpp"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.793191 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.793538 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.793996 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.794384 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.795250 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zv69t"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.794406 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.795830 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.796166 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.796204 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.796306 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.796782 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.797166 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.797217 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-vd8vf"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.797660 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.798132 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-vd8vf" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.806037 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.806047 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6r6g7"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.806363 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.806423 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.806723 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.806996 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.807683 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.812904 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.813342 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.813605 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.813616 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.813954 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.813656 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.814796 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-djs85"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.818523 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.819172 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.821786 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.821819 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.822119 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.822140 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.822384 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.822689 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.822717 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.822988 4937 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.829146 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.829741 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.830034 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.830281 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.830576 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.830753 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.830996 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.831628 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.830787 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.832416 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.832607 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.833001 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.833050 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.835634 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.840893 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.841014 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.841233 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.841254 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.841395 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.841539 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.841402 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.840992 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.840913 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.859870 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.860102 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.860336 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.860460 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.860606 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.860740 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.860860 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.861016 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.860616 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 25 15:49:43 crc kubenswrapper[4937]: 
I0225 15:49:43.864645 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.864835 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.864969 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.864963 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7drd4"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.866029 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.867738 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.871235 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.872952 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.873009 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.873157 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.873256 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rcxdq"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.873347 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.873456 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.873640 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.873830 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.873979 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.874049 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.874472 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.877836 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.882475 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.883346 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.885561 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.886414 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.886810 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.887261 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.887794 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.888133 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.890010 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.890639 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.890660 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.895874 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.896854 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.897068 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.897210 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.897390 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.898570 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.898715 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.898878 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.899005 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.899149 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.899327 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.900012 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.900174 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.900323 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.900398 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.900855 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 
15:49:43.901029 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.901229 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.906281 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.934828 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.945395 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.946081 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.948286 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.949570 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-oauth-config\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.949638 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b771f4d8-8253-4530-9e1a-e0ca06f263e4-auth-proxy-config\") pod \"machine-approver-56656f9798-sbntq\" (UID: \"b771f4d8-8253-4530-9e1a-e0ca06f263e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.949670 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-786cs\" (UniqueName: \"kubernetes.io/projected/ff089f24-3d05-4c97-b6f7-3a39cbec049f-kube-api-access-786cs\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.949694 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpjv8\" (UniqueName: \"kubernetes.io/projected/d33e6a6a-98b5-4eb8-8de5-8138395b48cb-kube-api-access-hpjv8\") pod \"dns-operator-744455d44c-qfghw\" (UID: \"d33e6a6a-98b5-4eb8-8de5-8138395b48cb\") " pod="openshift-dns-operator/dns-operator-744455d44c-qfghw" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.949720 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/430d304c-8623-4d01-a878-5db061d6a5b8-serving-cert\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.949738 4937 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.949771 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p57qc\" (UniqueName: \"kubernetes.io/projected/d9c49432-4c74-4842-bdd2-880414a4ad0a-kube-api-access-p57qc\") pod \"downloads-7954f5f757-vd8vf\" (UID: \"d9c49432-4c74-4842-bdd2-880414a4ad0a\") " pod="openshift-console/downloads-7954f5f757-vd8vf" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.949795 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm5cj\" (UniqueName: \"kubernetes.io/projected/2bfcb195-48c8-46cd-b417-aacb40f615f4-kube-api-access-cm5cj\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.949820 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06ec1775-ce0a-4a78-b4ea-75de7a931917-trusted-ca\") pod \"console-operator-58897d9998-znkpp\" (UID: \"06ec1775-ce0a-4a78-b4ea-75de7a931917\") " pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.949840 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7210df16-765e-4b49-8b67-8989f4b2f15c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8zn9j\" (UID: \"7210df16-765e-4b49-8b67-8989f4b2f15c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.949861 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d33e6a6a-98b5-4eb8-8de5-8138395b48cb-metrics-tls\") pod \"dns-operator-744455d44c-qfghw\" (UID: \"d33e6a6a-98b5-4eb8-8de5-8138395b48cb\") " pod="openshift-dns-operator/dns-operator-744455d44c-qfghw" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.949884 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc52d338-32a0-4072-8b02-578a41f8b3bc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4jjbg\" (UID: \"dc52d338-32a0-4072-8b02-578a41f8b3bc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.949911 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92721dbb-2c2a-448a-801f-579a9d2d9566-serving-cert\") pod \"openshift-config-operator-7777fb866f-zv69t\" (UID: \"92721dbb-2c2a-448a-801f-579a9d2d9566\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950060 4937 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dff4k\" (UniqueName: \"kubernetes.io/projected/92721dbb-2c2a-448a-801f-579a9d2d9566-kube-api-access-dff4k\") pod \"openshift-config-operator-7777fb866f-zv69t\" (UID: \"92721dbb-2c2a-448a-801f-579a9d2d9566\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950100 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s24f\" (UniqueName: \"kubernetes.io/projected/430d304c-8623-4d01-a878-5db061d6a5b8-kube-api-access-8s24f\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950122 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-config\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950144 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gn68\" (UniqueName: \"kubernetes.io/projected/33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d-kube-api-access-7gn68\") pod \"openshift-apiserver-operator-796bbdcf4f-wjfbd\" (UID: \"33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950166 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-audit-policies\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950189 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950216 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf09db34-1df7-44a2-a584-a032476e4d66-client-ca\") pod \"route-controller-manager-6576b87f9c-lgfgx\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950238 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-serving-cert\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950273 4937 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950295 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bfcb195-48c8-46cd-b417-aacb40f615f4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950311 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06ec1775-ce0a-4a78-b4ea-75de7a931917-config\") pod \"console-operator-58897d9998-znkpp\" (UID: \"06ec1775-ce0a-4a78-b4ea-75de7a931917\") " pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950334 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgw9l\" (UniqueName: \"kubernetes.io/projected/7210df16-765e-4b49-8b67-8989f4b2f15c-kube-api-access-fgw9l\") pod \"machine-api-operator-5694c8668f-8zn9j\" (UID: \"7210df16-765e-4b49-8b67-8989f4b2f15c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950354 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm6vd\" (UniqueName: \"kubernetes.io/projected/b771f4d8-8253-4530-9e1a-e0ca06f263e4-kube-api-access-jm6vd\") pod \"machine-approver-56656f9798-sbntq\" (UID: \"b771f4d8-8253-4530-9e1a-e0ca06f263e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950375 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-client-ca\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950398 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950418 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b771f4d8-8253-4530-9e1a-e0ca06f263e4-config\") pod \"machine-approver-56656f9798-sbntq\" (UID: \"b771f4d8-8253-4530-9e1a-e0ca06f263e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:43 crc 
kubenswrapper[4937]: I0225 15:49:43.950439 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950589 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfcb195-48c8-46cd-b417-aacb40f615f4-config\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950613 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950633 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950655 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06ec1775-ce0a-4a78-b4ea-75de7a931917-serving-cert\") pod \"console-operator-58897d9998-znkpp\" (UID: \"06ec1775-ce0a-4a78-b4ea-75de7a931917\") " pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950674 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/92721dbb-2c2a-448a-801f-579a9d2d9566-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zv69t\" (UID: \"92721dbb-2c2a-448a-801f-579a9d2d9566\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950697 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.950718 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a145826-4023-4211-aa90-aedba31d17c1-audit-dir\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.957732 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-trusted-ca-bundle\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.957835 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfzkl\" (UniqueName: \"kubernetes.io/projected/6a145826-4023-4211-aa90-aedba31d17c1-kube-api-access-rfzkl\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.957875 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wjfbd\" (UID: \"33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.957904 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7210df16-765e-4b49-8b67-8989f4b2f15c-images\") pod \"machine-api-operator-5694c8668f-8zn9j\" (UID: \"7210df16-765e-4b49-8b67-8989f4b2f15c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.957932 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf09db34-1df7-44a2-a584-a032476e4d66-config\") pod \"route-controller-manager-6576b87f9c-lgfgx\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.957956 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcfj6\" (UniqueName: \"kubernetes.io/projected/bf09db34-1df7-44a2-a584-a032476e4d66-kube-api-access-kcfj6\") pod \"route-controller-manager-6576b87f9c-lgfgx\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.957984 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wjfbd\" (UID: \"33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.958415 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.959357 4937 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.959533 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.960059 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.960413 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.960416 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-service-ca\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.960558 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.960597 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.960805 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bfcb195-48c8-46cd-b417-aacb40f615f4-serving-cert\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.960833 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bfcb195-48c8-46cd-b417-aacb40f615f4-service-ca-bundle\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.960853 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf09db34-1df7-44a2-a584-a032476e4d66-serving-cert\") pod \"route-controller-manager-6576b87f9c-lgfgx\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.961242 4937 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-config\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.961322 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-oauth-serving-cert\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.961349 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7210df16-765e-4b49-8b67-8989f4b2f15c-config\") pod \"machine-api-operator-5694c8668f-8zn9j\" (UID: \"7210df16-765e-4b49-8b67-8989f4b2f15c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.961505 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj859\" (UniqueName: \"kubernetes.io/projected/06ec1775-ce0a-4a78-b4ea-75de7a931917-kube-api-access-dj859\") pod \"console-operator-58897d9998-znkpp\" (UID: \"06ec1775-ce0a-4a78-b4ea-75de7a931917\") " pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.961639 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtdc6\" (UniqueName: \"kubernetes.io/projected/dc52d338-32a0-4072-8b02-578a41f8b3bc-kube-api-access-gtdc6\") pod \"cluster-samples-operator-665b6dd947-4jjbg\" (UID: \"dc52d338-32a0-4072-8b02-578a41f8b3bc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.961698 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.961733 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.961754 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b771f4d8-8253-4530-9e1a-e0ca06f263e4-machine-approver-tls\") pod \"machine-approver-56656f9798-sbntq\" (UID: \"b771f4d8-8253-4530-9e1a-e0ca06f263e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.962697 4937 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.962893 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.963243 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.963462 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.964219 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.966317 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.967223 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.969277 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.979427 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.980311 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.981291 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.983995 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533908-nq4vc"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.984703 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533908-nq4vc" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.985527 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.986523 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.990613 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-l66s9"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.991665 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l66s9" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.993113 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.994014 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.995286 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.995925 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.996534 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-57xqt"] Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.997454 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" Feb 25 15:49:43 crc kubenswrapper[4937]: I0225 15:49:43.998270 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.005205 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.007215 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.007448 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.008103 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.008673 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-b4jvp"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.009824 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-b4jvp" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.012562 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.013246 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.013769 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r5bpn"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.014244 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.015787 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.016856 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pdnqk"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.016916 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.017643 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.017910 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-q2wd7"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.018367 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.018837 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.019897 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.020650 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qfghw"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.021655 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-znkpp"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.022755 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wlfqx"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.022875 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.023939 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.025434 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8zn9j"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.026499 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.027453 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-29fxd"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.028505 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vd8vf"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.029657 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.030574 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pplcq"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.031365 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pplcq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.032001 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.035177 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zv69t"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.037206 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.038390 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.039552 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-djs85"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.040855 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rcxdq"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.041764 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.041973 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.043608 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7drd4"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.044921 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-l66s9"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.046837 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.051247 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.055938 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-57xqt"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.064545 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-48pnj"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.065810 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" 
Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.065848 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-node-pullsecrets\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.065871 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18ba8725-4b8c-4dcb-b0d9-3d07364d5c30-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzvb\" (UID: \"18ba8725-4b8c-4dcb-b0d9-3d07364d5c30\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.065973 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8214555c-5d28-43a6-8033-afe1e5a16c54-trusted-ca\") pod \"ingress-operator-5b745b69d9-gltdt\" (UID: \"8214555c-5d28-43a6-8033-afe1e5a16c54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.066104 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.066146 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bfcb195-48c8-46cd-b417-aacb40f615f4-serving-cert\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.066176 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bfcb195-48c8-46cd-b417-aacb40f615f4-service-ca-bundle\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.066201 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf09db34-1df7-44a2-a584-a032476e4d66-serving-cert\") pod \"route-controller-manager-6576b87f9c-lgfgx\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.066228 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f8285777-1554-41ed-8fef-daf8637a4c5d-encryption-config\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 
15:49:44.066227 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.066252 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69ffb\" (UniqueName: \"kubernetes.io/projected/a8883398-bb74-4223-bde1-bd53e899926a-kube-api-access-69ffb\") pod \"service-ca-9c57cc56f-57xqt\" (UID: \"a8883398-bb74-4223-bde1-bd53e899926a\") " pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.066281 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ba8725-4b8c-4dcb-b0d9-3d07364d5c30-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzvb\" (UID: \"18ba8725-4b8c-4dcb-b0d9-3d07364d5c30\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.066309 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13-serving-cert\") pod \"service-ca-operator-777779d784-rnsf5\" (UID: \"bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.066337 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-config\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.066432 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-oauth-serving-cert\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.066495 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7210df16-765e-4b49-8b67-8989f4b2f15c-config\") pod \"machine-api-operator-5694c8668f-8zn9j\" (UID: \"7210df16-765e-4b49-8b67-8989f4b2f15c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.067117 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.067146 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc560d1e-8ebe-4c2d-8597-21407baf4406-config\") pod \"kube-controller-manager-operator-78b949d7b-gvtds\" (UID: \"cc560d1e-8ebe-4c2d-8597-21407baf4406\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.067537 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a8883398-bb74-4223-bde1-bd53e899926a-signing-cabundle\") pod \"service-ca-9c57cc56f-57xqt\" (UID: \"a8883398-bb74-4223-bde1-bd53e899926a\") " pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.067621 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj859\" (UniqueName: \"kubernetes.io/projected/06ec1775-ce0a-4a78-b4ea-75de7a931917-kube-api-access-dj859\") pod \"console-operator-58897d9998-znkpp\" (UID: \"06ec1775-ce0a-4a78-b4ea-75de7a931917\") " pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.067702 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-etcd-client\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.067776 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psg9g\" (UniqueName: \"kubernetes.io/projected/7734eeb2-8011-4c7d-9614-e63f8d93b189-kube-api-access-psg9g\") pod \"package-server-manager-789f6589d5-mzv4j\" (UID: \"7734eeb2-8011-4c7d-9614-e63f8d93b189\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.067917 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtdc6\" (UniqueName: \"kubernetes.io/projected/dc52d338-32a0-4072-8b02-578a41f8b3bc-kube-api-access-gtdc6\") pod \"cluster-samples-operator-665b6dd947-4jjbg\" (UID: \"dc52d338-32a0-4072-8b02-578a41f8b3bc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.067990 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/127a885b-d7f5-47ed-890d-159a75a7f79e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6mrcl\" (UID: \"127a885b-d7f5-47ed-890d-159a75a7f79e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.068220 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: 
\"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.068372 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.068448 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b771f4d8-8253-4530-9e1a-e0ca06f263e4-machine-approver-tls\") pod \"machine-approver-56656f9798-sbntq\" (UID: \"b771f4d8-8253-4530-9e1a-e0ca06f263e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.068321 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-oauth-serving-cert\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.068696 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-serving-cert\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.068763 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8285777-1554-41ed-8fef-daf8637a4c5d-audit-dir\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.069028 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-oauth-config\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.069196 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b771f4d8-8253-4530-9e1a-e0ca06f263e4-auth-proxy-config\") pod \"machine-approver-56656f9798-sbntq\" (UID: \"b771f4d8-8253-4530-9e1a-e0ca06f263e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.069222 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8285777-1554-41ed-8fef-daf8637a4c5d-audit-policies\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.069443 4937 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-config\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.069775 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7210df16-765e-4b49-8b67-8989f4b2f15c-config\") pod \"machine-api-operator-5694c8668f-8zn9j\" (UID: \"7210df16-765e-4b49-8b67-8989f4b2f15c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.069912 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.070017 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.070198 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b771f4d8-8253-4530-9e1a-e0ca06f263e4-auth-proxy-config\") pod \"machine-approver-56656f9798-sbntq\" (UID: \"b771f4d8-8253-4530-9e1a-e0ca06f263e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.070405 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-786cs\" (UniqueName: \"kubernetes.io/projected/ff089f24-3d05-4c97-b6f7-3a39cbec049f-kube-api-access-786cs\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.070436 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpjv8\" (UniqueName: \"kubernetes.io/projected/d33e6a6a-98b5-4eb8-8de5-8138395b48cb-kube-api-access-hpjv8\") pod \"dns-operator-744455d44c-qfghw\" (UID: \"d33e6a6a-98b5-4eb8-8de5-8138395b48cb\") " pod="openshift-dns-operator/dns-operator-744455d44c-qfghw" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.070469 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/430d304c-8623-4d01-a878-5db061d6a5b8-serving-cert\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.070527 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.070584 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-d6bzm\" (UniqueName: \"kubernetes.io/projected/18ba8725-4b8c-4dcb-b0d9-3d07364d5c30-kube-api-access-d6bzm\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzvb\" (UID: \"18ba8725-4b8c-4dcb-b0d9-3d07364d5c30\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.071066 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bfcb195-48c8-46cd-b417-aacb40f615f4-service-ca-bundle\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.071383 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p57qc\" (UniqueName: \"kubernetes.io/projected/d9c49432-4c74-4842-bdd2-880414a4ad0a-kube-api-access-p57qc\") pod \"downloads-7954f5f757-vd8vf\" (UID: \"d9c49432-4c74-4842-bdd2-880414a4ad0a\") " pod="openshift-console/downloads-7954f5f757-vd8vf" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.071465 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc560d1e-8ebe-4c2d-8597-21407baf4406-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gvtds\" (UID: \"cc560d1e-8ebe-4c2d-8597-21407baf4406\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.071963 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533908-nq4vc"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.072580 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8959ab96-5694-4631-b3f5-bfcb7213a21d-config\") pod \"kube-apiserver-operator-766d6c64bb-bl268\" (UID: \"8959ab96-5694-4631-b3f5-bfcb7213a21d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.072625 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.072625 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w6lq\" (UniqueName: \"kubernetes.io/projected/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-kube-api-access-9w6lq\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.072682 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8214555c-5d28-43a6-8033-afe1e5a16c54-metrics-tls\") pod \"ingress-operator-5b745b69d9-gltdt\" (UID: \"8214555c-5d28-43a6-8033-afe1e5a16c54\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.072745 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm5cj\" (UniqueName: \"kubernetes.io/projected/2bfcb195-48c8-46cd-b417-aacb40f615f4-kube-api-access-cm5cj\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.072788 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06ec1775-ce0a-4a78-b4ea-75de7a931917-trusted-ca\") pod \"console-operator-58897d9998-znkpp\" (UID: \"06ec1775-ce0a-4a78-b4ea-75de7a931917\") " pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.072826 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.072854 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8285777-1554-41ed-8fef-daf8637a4c5d-serving-cert\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.072883 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7734eeb2-8011-4c7d-9614-e63f8d93b189-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mzv4j\" (UID: \"7734eeb2-8011-4c7d-9614-e63f8d93b189\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.072999 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7210df16-765e-4b49-8b67-8989f4b2f15c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8zn9j\" (UID: \"7210df16-765e-4b49-8b67-8989f4b2f15c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.073615 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d33e6a6a-98b5-4eb8-8de5-8138395b48cb-metrics-tls\") pod \"dns-operator-744455d44c-qfghw\" (UID: \"d33e6a6a-98b5-4eb8-8de5-8138395b48cb\") " pod="openshift-dns-operator/dns-operator-744455d44c-qfghw" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.073650 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc52d338-32a0-4072-8b02-578a41f8b3bc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4jjbg\" (UID: \"dc52d338-32a0-4072-8b02-578a41f8b3bc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg" Feb 25 15:49:44 crc 
kubenswrapper[4937]: I0225 15:49:44.074060 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92721dbb-2c2a-448a-801f-579a9d2d9566-serving-cert\") pod \"openshift-config-operator-7777fb866f-zv69t\" (UID: \"92721dbb-2c2a-448a-801f-579a9d2d9566\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074108 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dff4k\" (UniqueName: \"kubernetes.io/projected/92721dbb-2c2a-448a-801f-579a9d2d9566-kube-api-access-dff4k\") pod \"openshift-config-operator-7777fb866f-zv69t\" (UID: \"92721dbb-2c2a-448a-801f-579a9d2d9566\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074221 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06ec1775-ce0a-4a78-b4ea-75de7a931917-trusted-ca\") pod \"console-operator-58897d9998-znkpp\" (UID: \"06ec1775-ce0a-4a78-b4ea-75de7a931917\") " pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074272 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s24f\" (UniqueName: \"kubernetes.io/projected/430d304c-8623-4d01-a878-5db061d6a5b8-kube-api-access-8s24f\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074295 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074316 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8959ab96-5694-4631-b3f5-bfcb7213a21d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bl268\" (UID: \"8959ab96-5694-4631-b3f5-bfcb7213a21d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074345 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp947\" (UniqueName: \"kubernetes.io/projected/f8285777-1554-41ed-8fef-daf8637a4c5d-kube-api-access-kp947\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074371 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mmkj\" (UniqueName: \"kubernetes.io/projected/bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13-kube-api-access-6mmkj\") pod \"service-ca-operator-777779d784-rnsf5\" (UID: \"bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074403 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-config\") pod \"console-f9d7485db-djs85\" (UID: 
\"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074426 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-encryption-config\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074460 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-audit-dir\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074561 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f8285777-1554-41ed-8fef-daf8637a4c5d-etcd-client\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074760 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gn68\" (UniqueName: \"kubernetes.io/projected/33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d-kube-api-access-7gn68\") pod \"openshift-apiserver-operator-796bbdcf4f-wjfbd\" (UID: \"33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074847 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/430d304c-8623-4d01-a878-5db061d6a5b8-serving-cert\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074896 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf09db34-1df7-44a2-a584-a032476e4d66-serving-cert\") pod \"route-controller-manager-6576b87f9c-lgfgx\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074964 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-audit-policies\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.074999 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8285777-1554-41ed-8fef-daf8637a4c5d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075040 4937 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075071 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf09db34-1df7-44a2-a584-a032476e4d66-client-ca\") pod \"route-controller-manager-6576b87f9c-lgfgx\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075099 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-audit\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075127 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8214555c-5d28-43a6-8033-afe1e5a16c54-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gltdt\" (UID: \"8214555c-5d28-43a6-8033-afe1e5a16c54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075153 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bfcb195-48c8-46cd-b417-aacb40f615f4-serving-cert\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075183 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-serving-cert\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075402 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-oauth-config\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075475 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-config\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075558 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-image-import-ca\") pod \"apiserver-76f77b778f-7drd4\" (UID: 
\"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075615 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075630 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-audit-policies\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075661 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bfcb195-48c8-46cd-b417-aacb40f615f4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075682 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-config\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075749 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06ec1775-ce0a-4a78-b4ea-75de7a931917-config\") pod \"console-operator-58897d9998-znkpp\" (UID: \"06ec1775-ce0a-4a78-b4ea-75de7a931917\") " pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075839 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa374000-00c7-43a6-b3a4-1ced809e17e9-config\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075862 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a8883398-bb74-4223-bde1-bd53e899926a-signing-key\") pod \"service-ca-9c57cc56f-57xqt\" (UID: \"a8883398-bb74-4223-bde1-bd53e899926a\") " pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075900 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgw9l\" (UniqueName: \"kubernetes.io/projected/7210df16-765e-4b49-8b67-8989f4b2f15c-kube-api-access-fgw9l\") pod \"machine-api-operator-5694c8668f-8zn9j\" (UID: \"7210df16-765e-4b49-8b67-8989f4b2f15c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075916 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.075923 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm6vd\" (UniqueName: \"kubernetes.io/projected/b771f4d8-8253-4530-9e1a-e0ca06f263e4-kube-api-access-jm6vd\") pod \"machine-approver-56656f9798-sbntq\" (UID: \"b771f4d8-8253-4530-9e1a-e0ca06f263e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.076016 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7g6f\" (UniqueName: \"kubernetes.io/projected/127a885b-d7f5-47ed-890d-159a75a7f79e-kube-api-access-r7g6f\") pod \"machine-config-controller-84d6567774-6mrcl\" (UID: \"127a885b-d7f5-47ed-890d-159a75a7f79e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.076107 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-client-ca\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.076175 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.076247 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b771f4d8-8253-4530-9e1a-e0ca06f263e4-config\") pod \"machine-approver-56656f9798-sbntq\" (UID: \"b771f4d8-8253-4530-9e1a-e0ca06f263e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.076786 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b771f4d8-8253-4530-9e1a-e0ca06f263e4-config\") pod \"machine-approver-56656f9798-sbntq\" (UID: \"b771f4d8-8253-4530-9e1a-e0ca06f263e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.076817 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-client-ca\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.076833 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.076871 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfcb195-48c8-46cd-b417-aacb40f615f4-config\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.076896 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa374000-00c7-43a6-b3a4-1ced809e17e9-serving-cert\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.076993 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aa374000-00c7-43a6-b3a4-1ced809e17e9-etcd-service-ca\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077029 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077042 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf09db34-1df7-44a2-a584-a032476e4d66-client-ca\") pod \"route-controller-manager-6576b87f9c-lgfgx\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077073 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bfcb195-48c8-46cd-b417-aacb40f615f4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077092 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077167 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077170 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06ec1775-ce0a-4a78-b4ea-75de7a931917-serving-cert\") pod \"console-operator-58897d9998-znkpp\" (UID: \"06ec1775-ce0a-4a78-b4ea-75de7a931917\") " pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077209 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/92721dbb-2c2a-448a-801f-579a9d2d9566-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zv69t\" (UID: \"92721dbb-2c2a-448a-801f-579a9d2d9566\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077234 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aa374000-00c7-43a6-b3a4-1ced809e17e9-etcd-ca\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077553 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/92721dbb-2c2a-448a-801f-579a9d2d9566-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zv69t\" (UID: \"92721dbb-2c2a-448a-801f-579a9d2d9566\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077581 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077645 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a145826-4023-4211-aa90-aedba31d17c1-audit-dir\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077682 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kpkv\" (UniqueName: \"kubernetes.io/projected/aa374000-00c7-43a6-b3a4-1ced809e17e9-kube-api-access-2kpkv\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077713 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8959ab96-5694-4631-b3f5-bfcb7213a21d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bl268\" (UID: \"8959ab96-5694-4631-b3f5-bfcb7213a21d\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077761 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a145826-4023-4211-aa90-aedba31d17c1-audit-dir\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077789 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-trusted-ca-bundle\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.077880 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/127a885b-d7f5-47ed-890d-159a75a7f79e-proxy-tls\") pod \"machine-config-controller-84d6567774-6mrcl\" (UID: \"127a885b-d7f5-47ed-890d-159a75a7f79e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.078254 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfzkl\" (UniqueName: \"kubernetes.io/projected/6a145826-4023-4211-aa90-aedba31d17c1-kube-api-access-rfzkl\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.078369 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wjfbd\" (UID: \"33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.078525 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.078816 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7210df16-765e-4b49-8b67-8989f4b2f15c-images\") pod \"machine-api-operator-5694c8668f-8zn9j\" (UID: \"7210df16-765e-4b49-8b67-8989f4b2f15c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.078933 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf09db34-1df7-44a2-a584-a032476e4d66-config\") pod \"route-controller-manager-6576b87f9c-lgfgx\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079022 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-kcfj6\" (UniqueName: \"kubernetes.io/projected/bf09db34-1df7-44a2-a584-a032476e4d66-kube-api-access-kcfj6\") pod \"route-controller-manager-6576b87f9c-lgfgx\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079111 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc560d1e-8ebe-4c2d-8597-21407baf4406-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gvtds\" (UID: \"cc560d1e-8ebe-4c2d-8597-21407baf4406\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079181 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfcb195-48c8-46cd-b417-aacb40f615f4-config\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079236 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079298 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f8285777-1554-41ed-8fef-daf8637a4c5d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079391 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13-config\") pod \"service-ca-operator-777779d784-rnsf5\" (UID: \"bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079449 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d33e6a6a-98b5-4eb8-8de5-8138395b48cb-metrics-tls\") pod \"dns-operator-744455d44c-qfghw\" (UID: \"d33e6a6a-98b5-4eb8-8de5-8138395b48cb\") " pod="openshift-dns-operator/dns-operator-744455d44c-qfghw" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079561 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-trusted-ca-bundle\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079564 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ghqz\" (UniqueName: \"kubernetes.io/projected/8214555c-5d28-43a6-8033-afe1e5a16c54-kube-api-access-6ghqz\") pod \"ingress-operator-5b745b69d9-gltdt\" (UID: \"8214555c-5d28-43a6-8033-afe1e5a16c54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079644 4937 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wjfbd\" (UID: \"33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079690 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-service-ca\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079710 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aa374000-00c7-43a6-b3a4-1ced809e17e9-etcd-client\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079752 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-etcd-serving-ca\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.079544 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7210df16-765e-4b49-8b67-8989f4b2f15c-images\") pod \"machine-api-operator-5694c8668f-8zn9j\" (UID: \"7210df16-765e-4b49-8b67-8989f4b2f15c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.080428 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wjfbd\" (UID: \"33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.080600 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-service-ca\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.081146 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.081687 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92721dbb-2c2a-448a-801f-579a9d2d9566-serving-cert\") pod \"openshift-config-operator-7777fb866f-zv69t\" (UID: \"92721dbb-2c2a-448a-801f-579a9d2d9566\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.082045 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/7210df16-765e-4b49-8b67-8989f4b2f15c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8zn9j\" (UID: \"7210df16-765e-4b49-8b67-8989f4b2f15c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.082084 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf09db34-1df7-44a2-a584-a032476e4d66-config\") pod \"route-controller-manager-6576b87f9c-lgfgx\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.082226 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-serving-cert\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.082822 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wjfbd\" (UID: \"33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.083321 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.086196 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.084901 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.083894 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc52d338-32a0-4072-8b02-578a41f8b3bc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4jjbg\" (UID: \"dc52d338-32a0-4072-8b02-578a41f8b3bc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.085090 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b771f4d8-8253-4530-9e1a-e0ca06f263e4-machine-approver-tls\") pod \"machine-approver-56656f9798-sbntq\" (UID: \"b771f4d8-8253-4530-9e1a-e0ca06f263e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.085382 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06ec1775-ce0a-4a78-b4ea-75de7a931917-config\") pod \"console-operator-58897d9998-znkpp\" (UID: \"06ec1775-ce0a-4a78-b4ea-75de7a931917\") " 
pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.087005 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r5bpn"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.087976 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.088370 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.088659 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.088667 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.088955 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.088995 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.089383 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.089860 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dxb6p"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.090637 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dxb6p" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.091128 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.092437 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.093717 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.095139 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6r6g7"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.096617 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dxb6p"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.097288 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06ec1775-ce0a-4a78-b4ea-75de7a931917-serving-cert\") pod \"console-operator-58897d9998-znkpp\" (UID: \"06ec1775-ce0a-4a78-b4ea-75de7a931917\") " pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.098192 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.099603 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.101226 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-b4jvp"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.101377 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.102402 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pdnqk"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.103974 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-48pnj"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.105399 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9nd26"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.106367 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9nd26" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.107011 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9nd26"] Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.121637 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.141867 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.161710 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181020 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8959ab96-5694-4631-b3f5-bfcb7213a21d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bl268\" (UID: \"8959ab96-5694-4631-b3f5-bfcb7213a21d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181059 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mmkj\" (UniqueName: \"kubernetes.io/projected/bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13-kube-api-access-6mmkj\") pod \"service-ca-operator-777779d784-rnsf5\" (UID: \"bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181080 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-encryption-config\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181098 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f8285777-1554-41ed-8fef-daf8637a4c5d-etcd-client\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181124 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8214555c-5d28-43a6-8033-afe1e5a16c54-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gltdt\" (UID: \"8214555c-5d28-43a6-8033-afe1e5a16c54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181145 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-config\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181165 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-image-import-ca\") pod 
\"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181183 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a8883398-bb74-4223-bde1-bd53e899926a-signing-key\") pod \"service-ca-9c57cc56f-57xqt\" (UID: \"a8883398-bb74-4223-bde1-bd53e899926a\") " pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181207 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl2v2\" (UniqueName: \"kubernetes.io/projected/dba71048-faea-4ee2-bec3-70c2fa66a7e8-kube-api-access-nl2v2\") pod \"multus-admission-controller-857f4d67dd-b4jvp\" (UID: \"dba71048-faea-4ee2-bec3-70c2fa66a7e8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b4jvp" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181240 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7g6f\" (UniqueName: \"kubernetes.io/projected/127a885b-d7f5-47ed-890d-159a75a7f79e-kube-api-access-r7g6f\") pod \"machine-config-controller-84d6567774-6mrcl\" (UID: \"127a885b-d7f5-47ed-890d-159a75a7f79e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181259 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gs4z\" (UniqueName: \"kubernetes.io/projected/b2a49754-862e-459c-bc50-f1b1b67cb4ea-kube-api-access-6gs4z\") pod \"cluster-image-registry-operator-dc59b4c8b-fnvmt\" (UID: \"b2a49754-862e-459c-bc50-f1b1b67cb4ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181277 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aa374000-00c7-43a6-b3a4-1ced809e17e9-etcd-ca\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181295 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8959ab96-5694-4631-b3f5-bfcb7213a21d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bl268\" (UID: \"8959ab96-5694-4631-b3f5-bfcb7213a21d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181319 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f8285777-1554-41ed-8fef-daf8637a4c5d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181362 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13-config\") pod \"service-ca-operator-777779d784-rnsf5\" (UID: \"bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" Feb 25 
15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181381 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ghqz\" (UniqueName: \"kubernetes.io/projected/8214555c-5d28-43a6-8033-afe1e5a16c54-kube-api-access-6ghqz\") pod \"ingress-operator-5b745b69d9-gltdt\" (UID: \"8214555c-5d28-43a6-8033-afe1e5a16c54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181396 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aa374000-00c7-43a6-b3a4-1ced809e17e9-etcd-client\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181412 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-etcd-serving-ca\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181429 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2a49754-862e-459c-bc50-f1b1b67cb4ea-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fnvmt\" (UID: \"b2a49754-862e-459c-bc50-f1b1b67cb4ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181446 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8214555c-5d28-43a6-8033-afe1e5a16c54-trusted-ca\") pod \"ingress-operator-5b745b69d9-gltdt\" (UID: \"8214555c-5d28-43a6-8033-afe1e5a16c54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181465 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f8285777-1554-41ed-8fef-daf8637a4c5d-encryption-config\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181501 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69ffb\" (UniqueName: \"kubernetes.io/projected/a8883398-bb74-4223-bde1-bd53e899926a-kube-api-access-69ffb\") pod \"service-ca-9c57cc56f-57xqt\" (UID: \"a8883398-bb74-4223-bde1-bd53e899926a\") " pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181544 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/db1a84a9-f1b0-4dff-befd-796aebf7284b-certs\") pod \"machine-config-server-pplcq\" (UID: \"db1a84a9-f1b0-4dff-befd-796aebf7284b\") " pod="openshift-machine-config-operator/machine-config-server-pplcq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181564 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtqts\" (UniqueName: 
\"kubernetes.io/projected/db1a84a9-f1b0-4dff-befd-796aebf7284b-kube-api-access-mtqts\") pod \"machine-config-server-pplcq\" (UID: \"db1a84a9-f1b0-4dff-befd-796aebf7284b\") " pod="openshift-machine-config-operator/machine-config-server-pplcq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181584 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13-serving-cert\") pod \"service-ca-operator-777779d784-rnsf5\" (UID: \"bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181603 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc560d1e-8ebe-4c2d-8597-21407baf4406-config\") pod \"kube-controller-manager-operator-78b949d7b-gvtds\" (UID: \"cc560d1e-8ebe-4c2d-8597-21407baf4406\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181620 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-etcd-client\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181638 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psg9g\" (UniqueName: \"kubernetes.io/projected/7734eeb2-8011-4c7d-9614-e63f8d93b189-kube-api-access-psg9g\") pod \"package-server-manager-789f6589d5-mzv4j\" (UID: \"7734eeb2-8011-4c7d-9614-e63f8d93b189\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181657 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/127a885b-d7f5-47ed-890d-159a75a7f79e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6mrcl\" (UID: \"127a885b-d7f5-47ed-890d-159a75a7f79e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181686 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8285777-1554-41ed-8fef-daf8637a4c5d-audit-dir\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181706 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6bzm\" (UniqueName: \"kubernetes.io/projected/18ba8725-4b8c-4dcb-b0d9-3d07364d5c30-kube-api-access-d6bzm\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzvb\" (UID: \"18ba8725-4b8c-4dcb-b0d9-3d07364d5c30\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181725 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8959ab96-5694-4631-b3f5-bfcb7213a21d-config\") pod 
\"kube-apiserver-operator-766d6c64bb-bl268\" (UID: \"8959ab96-5694-4631-b3f5-bfcb7213a21d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181742 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181760 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7734eeb2-8011-4c7d-9614-e63f8d93b189-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mzv4j\" (UID: \"7734eeb2-8011-4c7d-9614-e63f8d93b189\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181785 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp947\" (UniqueName: \"kubernetes.io/projected/f8285777-1554-41ed-8fef-daf8637a4c5d-kube-api-access-kp947\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181803 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-audit-dir\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181819 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8285777-1554-41ed-8fef-daf8637a4c5d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181837 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-audit\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181863 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa374000-00c7-43a6-b3a4-1ced809e17e9-config\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181881 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/db1a84a9-f1b0-4dff-befd-796aebf7284b-node-bootstrap-token\") pod \"machine-config-server-pplcq\" (UID: \"db1a84a9-f1b0-4dff-befd-796aebf7284b\") " pod="openshift-machine-config-operator/machine-config-server-pplcq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181897 4937 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa374000-00c7-43a6-b3a4-1ced809e17e9-serving-cert\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181913 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aa374000-00c7-43a6-b3a4-1ced809e17e9-etcd-service-ca\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181931 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kpkv\" (UniqueName: \"kubernetes.io/projected/aa374000-00c7-43a6-b3a4-1ced809e17e9-kube-api-access-2kpkv\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181962 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/127a885b-d7f5-47ed-890d-159a75a7f79e-proxy-tls\") pod \"machine-config-controller-84d6567774-6mrcl\" (UID: \"127a885b-d7f5-47ed-890d-159a75a7f79e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.181986 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc560d1e-8ebe-4c2d-8597-21407baf4406-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gvtds\" (UID: \"cc560d1e-8ebe-4c2d-8597-21407baf4406\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182004 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-node-pullsecrets\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182021 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18ba8725-4b8c-4dcb-b0d9-3d07364d5c30-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzvb\" (UID: \"18ba8725-4b8c-4dcb-b0d9-3d07364d5c30\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182041 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ba8725-4b8c-4dcb-b0d9-3d07364d5c30-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzvb\" (UID: \"18ba8725-4b8c-4dcb-b0d9-3d07364d5c30\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182060 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/a8883398-bb74-4223-bde1-bd53e899926a-signing-cabundle\") pod \"service-ca-9c57cc56f-57xqt\" (UID: \"a8883398-bb74-4223-bde1-bd53e899926a\") " pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182091 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-serving-cert\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182109 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dba71048-faea-4ee2-bec3-70c2fa66a7e8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-b4jvp\" (UID: \"dba71048-faea-4ee2-bec3-70c2fa66a7e8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b4jvp" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182126 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8285777-1554-41ed-8fef-daf8637a4c5d-audit-policies\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182166 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc560d1e-8ebe-4c2d-8597-21407baf4406-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gvtds\" (UID: \"cc560d1e-8ebe-4c2d-8597-21407baf4406\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182186 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2a49754-862e-459c-bc50-f1b1b67cb4ea-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fnvmt\" (UID: \"b2a49754-862e-459c-bc50-f1b1b67cb4ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182206 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w6lq\" (UniqueName: \"kubernetes.io/projected/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-kube-api-access-9w6lq\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182222 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8214555c-5d28-43a6-8033-afe1e5a16c54-metrics-tls\") pod \"ingress-operator-5b745b69d9-gltdt\" (UID: \"8214555c-5d28-43a6-8033-afe1e5a16c54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182245 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2a49754-862e-459c-bc50-f1b1b67cb4ea-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fnvmt\" (UID: 
\"b2a49754-862e-459c-bc50-f1b1b67cb4ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182263 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8285777-1554-41ed-8fef-daf8637a4c5d-serving-cert\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.182952 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8285777-1554-41ed-8fef-daf8637a4c5d-audit-dir\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.183749 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/127a885b-d7f5-47ed-890d-159a75a7f79e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6mrcl\" (UID: \"127a885b-d7f5-47ed-890d-159a75a7f79e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.183995 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f8285777-1554-41ed-8fef-daf8637a4c5d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.184413 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8285777-1554-41ed-8fef-daf8637a4c5d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.184569 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-audit-dir\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.184582 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-etcd-serving-ca\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.184704 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aa374000-00c7-43a6-b3a4-1ced809e17e9-etcd-service-ca\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.185457 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8285777-1554-41ed-8fef-daf8637a4c5d-audit-policies\") 
pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.185548 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa374000-00c7-43a6-b3a4-1ced809e17e9-config\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.185757 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-image-import-ca\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.185769 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8959ab96-5694-4631-b3f5-bfcb7213a21d-config\") pod \"kube-apiserver-operator-766d6c64bb-bl268\" (UID: \"8959ab96-5694-4631-b3f5-bfcb7213a21d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.186046 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-audit\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.186070 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-serving-cert\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.186080 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.186363 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-node-pullsecrets\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.187339 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-config\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.187451 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8285777-1554-41ed-8fef-daf8637a4c5d-serving-cert\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: 
\"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.187507 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aa374000-00c7-43a6-b3a4-1ced809e17e9-etcd-ca\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.187687 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc560d1e-8ebe-4c2d-8597-21407baf4406-config\") pod \"kube-controller-manager-operator-78b949d7b-gvtds\" (UID: \"cc560d1e-8ebe-4c2d-8597-21407baf4406\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.187882 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa374000-00c7-43a6-b3a4-1ced809e17e9-serving-cert\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.188225 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aa374000-00c7-43a6-b3a4-1ced809e17e9-etcd-client\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.188291 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f8285777-1554-41ed-8fef-daf8637a4c5d-encryption-config\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.190093 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-etcd-client\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.190241 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-encryption-config\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.190474 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc560d1e-8ebe-4c2d-8597-21407baf4406-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gvtds\" (UID: \"cc560d1e-8ebe-4c2d-8597-21407baf4406\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.190817 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f8285777-1554-41ed-8fef-daf8637a4c5d-etcd-client\") 
pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.191530 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8959ab96-5694-4631-b3f5-bfcb7213a21d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bl268\" (UID: \"8959ab96-5694-4631-b3f5-bfcb7213a21d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.202361 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.211569 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/127a885b-d7f5-47ed-890d-159a75a7f79e-proxy-tls\") pod \"machine-config-controller-84d6567774-6mrcl\" (UID: \"127a885b-d7f5-47ed-890d-159a75a7f79e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.221283 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.242151 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.262049 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.273296 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18ba8725-4b8c-4dcb-b0d9-3d07364d5c30-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzvb\" (UID: \"18ba8725-4b8c-4dcb-b0d9-3d07364d5c30\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.282218 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.283289 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dba71048-faea-4ee2-bec3-70c2fa66a7e8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-b4jvp\" (UID: \"dba71048-faea-4ee2-bec3-70c2fa66a7e8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b4jvp" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.283344 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2a49754-862e-459c-bc50-f1b1b67cb4ea-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fnvmt\" (UID: \"b2a49754-862e-459c-bc50-f1b1b67cb4ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.283400 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b2a49754-862e-459c-bc50-f1b1b67cb4ea-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fnvmt\" (UID: \"b2a49754-862e-459c-bc50-f1b1b67cb4ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.283464 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl2v2\" (UniqueName: \"kubernetes.io/projected/dba71048-faea-4ee2-bec3-70c2fa66a7e8-kube-api-access-nl2v2\") pod \"multus-admission-controller-857f4d67dd-b4jvp\" (UID: \"dba71048-faea-4ee2-bec3-70c2fa66a7e8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b4jvp" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.283524 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gs4z\" (UniqueName: \"kubernetes.io/projected/b2a49754-862e-459c-bc50-f1b1b67cb4ea-kube-api-access-6gs4z\") pod \"cluster-image-registry-operator-dc59b4c8b-fnvmt\" (UID: \"b2a49754-862e-459c-bc50-f1b1b67cb4ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.283570 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2a49754-862e-459c-bc50-f1b1b67cb4ea-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fnvmt\" (UID: \"b2a49754-862e-459c-bc50-f1b1b67cb4ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.283595 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtqts\" (UniqueName: \"kubernetes.io/projected/db1a84a9-f1b0-4dff-befd-796aebf7284b-kube-api-access-mtqts\") pod \"machine-config-server-pplcq\" (UID: \"db1a84a9-f1b0-4dff-befd-796aebf7284b\") " pod="openshift-machine-config-operator/machine-config-server-pplcq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.283618 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/db1a84a9-f1b0-4dff-befd-796aebf7284b-certs\") pod \"machine-config-server-pplcq\" (UID: \"db1a84a9-f1b0-4dff-befd-796aebf7284b\") " pod="openshift-machine-config-operator/machine-config-server-pplcq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.283710 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/db1a84a9-f1b0-4dff-befd-796aebf7284b-node-bootstrap-token\") pod \"machine-config-server-pplcq\" (UID: \"db1a84a9-f1b0-4dff-befd-796aebf7284b\") " pod="openshift-machine-config-operator/machine-config-server-pplcq" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.301179 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.307178 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ba8725-4b8c-4dcb-b0d9-3d07364d5c30-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzvb\" (UID: \"18ba8725-4b8c-4dcb-b0d9-3d07364d5c30\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 
15:49:44.321239 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.341409 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.360979 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.366962 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.367093 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.367040 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.380873 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.400841 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.422456 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.441299 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.447693 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8214555c-5d28-43a6-8033-afe1e5a16c54-metrics-tls\") pod \"ingress-operator-5b745b69d9-gltdt\" (UID: \"8214555c-5d28-43a6-8033-afe1e5a16c54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.470849 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.477808 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8214555c-5d28-43a6-8033-afe1e5a16c54-trusted-ca\") pod \"ingress-operator-5b745b69d9-gltdt\" (UID: \"8214555c-5d28-43a6-8033-afe1e5a16c54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.481781 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.501833 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.521443 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 25 15:49:44 crc 
kubenswrapper[4937]: I0225 15:49:44.541546 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.561705 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.580888 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.601067 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.621706 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.641715 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.661501 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.682945 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.701848 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.722045 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.742215 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.761779 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.792238 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.795621 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2a49754-862e-459c-bc50-f1b1b67cb4ea-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fnvmt\" (UID: \"b2a49754-862e-459c-bc50-f1b1b67cb4ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.801985 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.822367 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.827076 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/b2a49754-862e-459c-bc50-f1b1b67cb4ea-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fnvmt\" (UID: \"b2a49754-862e-459c-bc50-f1b1b67cb4ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.841686 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.861450 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.881743 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.890557 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13-serving-cert\") pod \"service-ca-operator-777779d784-rnsf5\" (UID: \"bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.903118 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.904858 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13-config\") pod \"service-ca-operator-777779d784-rnsf5\" (UID: \"bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.923141 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.941306 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.960785 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.981765 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 25 15:49:44 crc kubenswrapper[4937]: I0225 15:49:44.993071 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a8883398-bb74-4223-bde1-bd53e899926a-signing-key\") pod \"service-ca-9c57cc56f-57xqt\" (UID: \"a8883398-bb74-4223-bde1-bd53e899926a\") " pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.000100 4937 request.go:700] Waited for 1.002280664s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/configmaps?fieldSelector=metadata.name%3Dsigning-cabundle&limit=500&resourceVersion=0 Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.002098 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.009355 4937 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a8883398-bb74-4223-bde1-bd53e899926a-signing-cabundle\") pod \"service-ca-9c57cc56f-57xqt\" (UID: \"a8883398-bb74-4223-bde1-bd53e899926a\") " pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.022707 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.042861 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.061359 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.082017 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.101298 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.122565 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.142378 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.148160 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dba71048-faea-4ee2-bec3-70c2fa66a7e8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-b4jvp\" (UID: \"dba71048-faea-4ee2-bec3-70c2fa66a7e8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b4jvp" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.162402 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.183048 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 25 15:49:45 crc kubenswrapper[4937]: E0225 15:49:45.186064 4937 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 25 15:49:45 crc kubenswrapper[4937]: E0225 15:49:45.186210 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7734eeb2-8011-4c7d-9614-e63f8d93b189-package-server-manager-serving-cert podName:7734eeb2-8011-4c7d-9614-e63f8d93b189 nodeName:}" failed. No retries permitted until 2026-02-25 15:49:45.686176559 +0000 UTC m=+236.699568469 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/7734eeb2-8011-4c7d-9614-e63f8d93b189-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-mzv4j" (UID: "7734eeb2-8011-4c7d-9614-e63f8d93b189") : failed to sync secret cache: timed out waiting for the condition Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.202284 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.222569 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.261769 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.262422 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.282141 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 25 15:49:45 crc kubenswrapper[4937]: E0225 15:49:45.284550 4937 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Feb 25 15:49:45 crc kubenswrapper[4937]: E0225 15:49:45.284719 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db1a84a9-f1b0-4dff-befd-796aebf7284b-certs podName:db1a84a9-f1b0-4dff-befd-796aebf7284b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:45.784695243 +0000 UTC m=+236.798087153 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/db1a84a9-f1b0-4dff-befd-796aebf7284b-certs") pod "machine-config-server-pplcq" (UID: "db1a84a9-f1b0-4dff-befd-796aebf7284b") : failed to sync secret cache: timed out waiting for the condition Feb 25 15:49:45 crc kubenswrapper[4937]: E0225 15:49:45.285126 4937 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Feb 25 15:49:45 crc kubenswrapper[4937]: E0225 15:49:45.285313 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db1a84a9-f1b0-4dff-befd-796aebf7284b-node-bootstrap-token podName:db1a84a9-f1b0-4dff-befd-796aebf7284b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:45.785296678 +0000 UTC m=+236.798688578 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/db1a84a9-f1b0-4dff-befd-796aebf7284b-node-bootstrap-token") pod "machine-config-server-pplcq" (UID: "db1a84a9-f1b0-4dff-befd-796aebf7284b") : failed to sync secret cache: timed out waiting for the condition Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.301346 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.322150 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.342238 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.361804 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.382244 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.401310 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.422215 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.441521 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.461920 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.484512 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.501986 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.521915 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.542699 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.561058 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.583510 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.621847 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.645146 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 25 15:49:45 crc 
kubenswrapper[4937]: I0225 15:49:45.645517 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.662823 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.703011 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.703730 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7734eeb2-8011-4c7d-9614-e63f8d93b189-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mzv4j\" (UID: \"7734eeb2-8011-4c7d-9614-e63f8d93b189\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.712973 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7734eeb2-8011-4c7d-9614-e63f8d93b189-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mzv4j\" (UID: \"7734eeb2-8011-4c7d-9614-e63f8d93b189\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.737456 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj859\" (UniqueName: \"kubernetes.io/projected/06ec1775-ce0a-4a78-b4ea-75de7a931917-kube-api-access-dj859\") pod \"console-operator-58897d9998-znkpp\" (UID: \"06ec1775-ce0a-4a78-b4ea-75de7a931917\") " pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.741866 4937 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.762005 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.798525 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtdc6\" (UniqueName: \"kubernetes.io/projected/dc52d338-32a0-4072-8b02-578a41f8b3bc-kube-api-access-gtdc6\") pod \"cluster-samples-operator-665b6dd947-4jjbg\" (UID: \"dc52d338-32a0-4072-8b02-578a41f8b3bc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.799811 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.807068 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/db1a84a9-f1b0-4dff-befd-796aebf7284b-certs\") pod \"machine-config-server-pplcq\" (UID: \"db1a84a9-f1b0-4dff-befd-796aebf7284b\") " pod="openshift-machine-config-operator/machine-config-server-pplcq" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.807745 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/db1a84a9-f1b0-4dff-befd-796aebf7284b-node-bootstrap-token\") pod \"machine-config-server-pplcq\" (UID: \"db1a84a9-f1b0-4dff-befd-796aebf7284b\") " pod="openshift-machine-config-operator/machine-config-server-pplcq" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.813436 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/db1a84a9-f1b0-4dff-befd-796aebf7284b-certs\") pod \"machine-config-server-pplcq\" (UID: \"db1a84a9-f1b0-4dff-befd-796aebf7284b\") " pod="openshift-machine-config-operator/machine-config-server-pplcq" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.815740 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/db1a84a9-f1b0-4dff-befd-796aebf7284b-node-bootstrap-token\") pod \"machine-config-server-pplcq\" (UID: \"db1a84a9-f1b0-4dff-befd-796aebf7284b\") " pod="openshift-machine-config-operator/machine-config-server-pplcq" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.821034 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-786cs\" (UniqueName: \"kubernetes.io/projected/ff089f24-3d05-4c97-b6f7-3a39cbec049f-kube-api-access-786cs\") pod \"console-f9d7485db-djs85\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.829677 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.851869 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p57qc\" (UniqueName: \"kubernetes.io/projected/d9c49432-4c74-4842-bdd2-880414a4ad0a-kube-api-access-p57qc\") pod \"downloads-7954f5f757-vd8vf\" (UID: \"d9c49432-4c74-4842-bdd2-880414a4ad0a\") " pod="openshift-console/downloads-7954f5f757-vd8vf" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.885590 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpjv8\" (UniqueName: \"kubernetes.io/projected/d33e6a6a-98b5-4eb8-8de5-8138395b48cb-kube-api-access-hpjv8\") pod \"dns-operator-744455d44c-qfghw\" (UID: \"d33e6a6a-98b5-4eb8-8de5-8138395b48cb\") " pod="openshift-dns-operator/dns-operator-744455d44c-qfghw" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.903329 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s24f\" (UniqueName: \"kubernetes.io/projected/430d304c-8623-4d01-a878-5db061d6a5b8-kube-api-access-8s24f\") pod \"controller-manager-879f6c89f-29fxd\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.916158 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.921022 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dff4k\" (UniqueName: \"kubernetes.io/projected/92721dbb-2c2a-448a-801f-579a9d2d9566-kube-api-access-dff4k\") pod \"openshift-config-operator-7777fb866f-zv69t\" (UID: \"92721dbb-2c2a-448a-801f-579a9d2d9566\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.941685 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gn68\" (UniqueName: \"kubernetes.io/projected/33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d-kube-api-access-7gn68\") pod \"openshift-apiserver-operator-796bbdcf4f-wjfbd\" (UID: \"33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.953938 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm5cj\" (UniqueName: \"kubernetes.io/projected/2bfcb195-48c8-46cd-b417-aacb40f615f4-kube-api-access-cm5cj\") pod \"authentication-operator-69f744f599-wlfqx\" (UID: \"2bfcb195-48c8-46cd-b417-aacb40f615f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.962550 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm6vd\" (UniqueName: \"kubernetes.io/projected/b771f4d8-8253-4530-9e1a-e0ca06f263e4-kube-api-access-jm6vd\") pod \"machine-approver-56656f9798-sbntq\" (UID: \"b771f4d8-8253-4530-9e1a-e0ca06f263e4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.965275 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qfghw" Feb 25 15:49:45 crc kubenswrapper[4937]: I0225 15:49:45.980954 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.000152 4937 request.go:700] Waited for 1.920397862s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.000215 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.003103 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgw9l\" (UniqueName: \"kubernetes.io/projected/7210df16-765e-4b49-8b67-8989f4b2f15c-kube-api-access-fgw9l\") pod \"machine-api-operator-5694c8668f-8zn9j\" (UID: \"7210df16-765e-4b49-8b67-8989f4b2f15c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.003874 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfzkl\" (UniqueName: \"kubernetes.io/projected/6a145826-4023-4211-aa90-aedba31d17c1-kube-api-access-rfzkl\") pod \"oauth-openshift-558db77b4-6r6g7\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.021801 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.026962 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcfj6\" (UniqueName: \"kubernetes.io/projected/bf09db34-1df7-44a2-a584-a032476e4d66-kube-api-access-kcfj6\") pod \"route-controller-manager-6576b87f9c-lgfgx\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.048731 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.053598 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.090993 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.091133 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-vd8vf" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.091844 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.098403 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.098547 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.112078 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.116710 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.122587 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.142387 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.186201 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kpkv\" (UniqueName: \"kubernetes.io/projected/aa374000-00c7-43a6-b3a4-1ced809e17e9-kube-api-access-2kpkv\") pod \"etcd-operator-b45778765-rcxdq\" (UID: \"aa374000-00c7-43a6-b3a4-1ced809e17e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.205421 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ghqz\" (UniqueName: \"kubernetes.io/projected/8214555c-5d28-43a6-8033-afe1e5a16c54-kube-api-access-6ghqz\") pod \"ingress-operator-5b745b69d9-gltdt\" (UID: \"8214555c-5d28-43a6-8033-afe1e5a16c54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.224416 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7g6f\" (UniqueName: \"kubernetes.io/projected/127a885b-d7f5-47ed-890d-159a75a7f79e-kube-api-access-r7g6f\") pod \"machine-config-controller-84d6567774-6mrcl\" (UID: \"127a885b-d7f5-47ed-890d-159a75a7f79e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.235229 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.236521 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.245330 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w6lq\" (UniqueName: \"kubernetes.io/projected/0f9fc900-6cf7-4890-8e7d-6925e9e3862f-kube-api-access-9w6lq\") pod \"apiserver-76f77b778f-7drd4\" (UID: \"0f9fc900-6cf7-4890-8e7d-6925e9e3862f\") " pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.263530 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp947\" (UniqueName: \"kubernetes.io/projected/f8285777-1554-41ed-8fef-daf8637a4c5d-kube-api-access-kp947\") pod \"apiserver-7bbb656c7d-5jprg\" (UID: \"f8285777-1554-41ed-8fef-daf8637a4c5d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.277698 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6bzm\" (UniqueName: \"kubernetes.io/projected/18ba8725-4b8c-4dcb-b0d9-3d07364d5c30-kube-api-access-d6bzm\") pod \"openshift-controller-manager-operator-756b6f6bc6-mhzvb\" (UID: \"18ba8725-4b8c-4dcb-b0d9-3d07364d5c30\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.291230 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.304420 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc560d1e-8ebe-4c2d-8597-21407baf4406-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gvtds\" (UID: \"cc560d1e-8ebe-4c2d-8597-21407baf4406\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.318960 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69ffb\" (UniqueName: \"kubernetes.io/projected/a8883398-bb74-4223-bde1-bd53e899926a-kube-api-access-69ffb\") pod \"service-ca-9c57cc56f-57xqt\" (UID: \"a8883398-bb74-4223-bde1-bd53e899926a\") " pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.337187 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mmkj\" (UniqueName: \"kubernetes.io/projected/bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13-kube-api-access-6mmkj\") pod \"service-ca-operator-777779d784-rnsf5\" (UID: \"bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.343892 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.352793 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.354812 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-djs85"] Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.366585 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8959ab96-5694-4631-b3f5-bfcb7213a21d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bl268\" (UID: \"8959ab96-5694-4631-b3f5-bfcb7213a21d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.384479 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psg9g\" (UniqueName: \"kubernetes.io/projected/7734eeb2-8011-4c7d-9614-e63f8d93b189-kube-api-access-psg9g\") pod \"package-server-manager-789f6589d5-mzv4j\" (UID: \"7734eeb2-8011-4c7d-9614-e63f8d93b189\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.401793 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8214555c-5d28-43a6-8033-afe1e5a16c54-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gltdt\" (UID: \"8214555c-5d28-43a6-8033-afe1e5a16c54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.419135 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.440848 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.448554 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl2v2\" (UniqueName: \"kubernetes.io/projected/dba71048-faea-4ee2-bec3-70c2fa66a7e8-kube-api-access-nl2v2\") pod \"multus-admission-controller-857f4d67dd-b4jvp\" (UID: \"dba71048-faea-4ee2-bec3-70c2fa66a7e8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b4jvp" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.457700 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.457711 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.468855 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.470360 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2a49754-862e-459c-bc50-f1b1b67cb4ea-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fnvmt\" (UID: \"b2a49754-862e-459c-bc50-f1b1b67cb4ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.486393 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtqts\" (UniqueName: \"kubernetes.io/projected/db1a84a9-f1b0-4dff-befd-796aebf7284b-kube-api-access-mtqts\") pod \"machine-config-server-pplcq\" (UID: \"db1a84a9-f1b0-4dff-befd-796aebf7284b\") " pod="openshift-machine-config-operator/machine-config-server-pplcq" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.508402 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.521716 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gs4z\" (UniqueName: \"kubernetes.io/projected/b2a49754-862e-459c-bc50-f1b1b67cb4ea-kube-api-access-6gs4z\") pod \"cluster-image-registry-operator-dc59b4c8b-fnvmt\" (UID: \"b2a49754-862e-459c-bc50-f1b1b67cb4ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.521897 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-djs85" event={"ID":"ff089f24-3d05-4c97-b6f7-3a39cbec049f","Type":"ContainerStarted","Data":"85ad2c9be1c0d93692e33a0f01838d85a804c6ecfc1da4810c3df73376f3da2c"} Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.522811 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.524527 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.545684 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.547273 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.553280 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" event={"ID":"b771f4d8-8253-4530-9e1a-e0ca06f263e4","Type":"ContainerStarted","Data":"3c91e4d2e527aa49019c0de50bc98c261e14c4e3f978c7d1eb9feab46ba1f88f"} Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.562874 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.564936 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zv69t"] Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.564956 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.614443 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg"] Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.625779 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-29fxd"] Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.636106 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.637694 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-trusted-ca\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.637758 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b1dc13b-9b02-42b0-a00e-21f15f9f98a2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dz785\" (UID: \"9b1dc13b-9b02-42b0-a00e-21f15f9f98a2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.637795 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.637817 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a22f23f6-fdc2-4842-bd45-3dd5695c48c6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vk5tb\" (UID: \"a22f23f6-fdc2-4842-bd45-3dd5695c48c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.637843 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-registry-certificates\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.637887 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.637940 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49fa31be-2461-4113-96b3-a1da363827c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-crxdn\" (UID: \"49fa31be-2461-4113-96b3-a1da363827c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:46 crc kubenswrapper[4937]: E0225 15:49:46.639821 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:47.139795826 +0000 UTC m=+238.153187706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.662986 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qfghw"] Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.663028 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wlfqx"] Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665400 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6r6g7"] Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665511 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a22f23f6-fdc2-4842-bd45-3dd5695c48c6-proxy-tls\") pod \"machine-config-operator-74547568cd-vk5tb\" (UID: \"a22f23f6-fdc2-4842-bd45-3dd5695c48c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665573 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f275x\" (UniqueName: \"kubernetes.io/projected/49fa31be-2461-4113-96b3-a1da363827c7-kube-api-access-f275x\") pod \"packageserver-d55dfcdfc-crxdn\" (UID: \"49fa31be-2461-4113-96b3-a1da363827c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665644 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-registry-tls\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665680 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bt27k\" (UniqueName: \"kubernetes.io/projected/a22f23f6-fdc2-4842-bd45-3dd5695c48c6-kube-api-access-bt27k\") pod \"machine-config-operator-74547568cd-vk5tb\" (UID: \"a22f23f6-fdc2-4842-bd45-3dd5695c48c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665718 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng78b\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-kube-api-access-ng78b\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665741 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a22f23f6-fdc2-4842-bd45-3dd5695c48c6-images\") pod \"machine-config-operator-74547568cd-vk5tb\" (UID: \"a22f23f6-fdc2-4842-bd45-3dd5695c48c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665764 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49fa31be-2461-4113-96b3-a1da363827c7-webhook-cert\") pod \"packageserver-d55dfcdfc-crxdn\" (UID: \"49fa31be-2461-4113-96b3-a1da363827c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665787 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d9c7aa27-d268-45e4-be93-97de8f8dfb8c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fzfv4\" (UID: \"d9c7aa27-d268-45e4-be93-97de8f8dfb8c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665833 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665856 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbbbf\" (UniqueName: \"kubernetes.io/projected/9b1dc13b-9b02-42b0-a00e-21f15f9f98a2-kube-api-access-cbbbf\") pod \"control-plane-machine-set-operator-78cbb6b69f-dz785\" (UID: \"9b1dc13b-9b02-42b0-a00e-21f15f9f98a2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665879 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/49fa31be-2461-4113-96b3-a1da363827c7-tmpfs\") pod \"packageserver-d55dfcdfc-crxdn\" (UID: \"49fa31be-2461-4113-96b3-a1da363827c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665899 4937 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-bound-sa-token\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.665931 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d9c7aa27-d268-45e4-be93-97de8f8dfb8c-srv-cert\") pod \"olm-operator-6b444d44fb-fzfv4\" (UID: \"d9c7aa27-d268-45e4-be93-97de8f8dfb8c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.690887 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-b4jvp" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.755138 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vd8vf"] Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.762088 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd"] Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.764150 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-znkpp"] Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.764317 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pplcq" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766420 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766655 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/727110e8-1674-467e-b39a-0fca0b874523-profile-collector-cert\") pod \"catalog-operator-68c6474976-hhvn4\" (UID: \"727110e8-1674-467e-b39a-0fca0b874523\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766688 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8g7sk\" (UID: \"701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766718 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-registry-certificates\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766746 4937 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-csi-data-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766806 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fcbm\" (UniqueName: \"kubernetes.io/projected/42f414e1-fd0d-4e08-8783-f68ae63af8c8-kube-api-access-5fcbm\") pod \"migrator-59844c95c7-l66s9\" (UID: \"42f414e1-fd0d-4e08-8783-f68ae63af8c8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l66s9" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766840 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49fa31be-2461-4113-96b3-a1da363827c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-crxdn\" (UID: \"49fa31be-2461-4113-96b3-a1da363827c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766872 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/906509ff-be49-4c28-95b5-9f80cb885ece-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r5bpn\" (UID: \"906509ff-be49-4c28-95b5-9f80cb885ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766891 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a22f23f6-fdc2-4842-bd45-3dd5695c48c6-proxy-tls\") pod \"machine-config-operator-74547568cd-vk5tb\" (UID: \"a22f23f6-fdc2-4842-bd45-3dd5695c48c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766911 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-mountpoint-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766926 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/727110e8-1674-467e-b39a-0fca0b874523-srv-cert\") pod \"catalog-operator-68c6474976-hhvn4\" (UID: \"727110e8-1674-467e-b39a-0fca0b874523\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766942 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f275x\" (UniqueName: \"kubernetes.io/projected/49fa31be-2461-4113-96b3-a1da363827c7-kube-api-access-f275x\") pod \"packageserver-d55dfcdfc-crxdn\" (UID: \"49fa31be-2461-4113-96b3-a1da363827c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766970 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/54bfd5df-332a-44ff-9da6-5a2f6a29083c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dt5kp\" (UID: \"54bfd5df-332a-44ff-9da6-5a2f6a29083c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.766986 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvhhm\" (UniqueName: \"kubernetes.io/projected/906509ff-be49-4c28-95b5-9f80cb885ece-kube-api-access-hvhhm\") pod \"marketplace-operator-79b997595-r5bpn\" (UID: \"906509ff-be49-4c28-95b5-9f80cb885ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767024 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-registry-tls\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767067 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt27k\" (UniqueName: \"kubernetes.io/projected/a22f23f6-fdc2-4842-bd45-3dd5695c48c6-kube-api-access-bt27k\") pod \"machine-config-operator-74547568cd-vk5tb\" (UID: \"a22f23f6-fdc2-4842-bd45-3dd5695c48c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767086 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a22f23f6-fdc2-4842-bd45-3dd5695c48c6-images\") pod \"machine-config-operator-74547568cd-vk5tb\" (UID: \"a22f23f6-fdc2-4842-bd45-3dd5695c48c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767104 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-config-volume\") pod \"collect-profiles-29533905-jckrw\" (UID: \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767121 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e21358c1-fad3-42c2-982d-8f3e50fadc34-default-certificate\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767140 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng78b\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-kube-api-access-ng78b\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767159 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qznfh\" (UniqueName: 
\"kubernetes.io/projected/e21358c1-fad3-42c2-982d-8f3e50fadc34-kube-api-access-qznfh\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767174 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49fa31be-2461-4113-96b3-a1da363827c7-webhook-cert\") pod \"packageserver-d55dfcdfc-crxdn\" (UID: \"49fa31be-2461-4113-96b3-a1da363827c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767191 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d9c7aa27-d268-45e4-be93-97de8f8dfb8c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fzfv4\" (UID: \"d9c7aa27-d268-45e4-be93-97de8f8dfb8c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767208 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvl4r\" (UniqueName: \"kubernetes.io/projected/54bfd5df-332a-44ff-9da6-5a2f6a29083c-kube-api-access-bvl4r\") pod \"kube-storage-version-migrator-operator-b67b599dd-dt5kp\" (UID: \"54bfd5df-332a-44ff-9da6-5a2f6a29083c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767234 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54bfd5df-332a-44ff-9da6-5a2f6a29083c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dt5kp\" (UID: \"54bfd5df-332a-44ff-9da6-5a2f6a29083c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767259 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d5cd\" (UniqueName: \"kubernetes.io/projected/d9c7aa27-d268-45e4-be93-97de8f8dfb8c-kube-api-access-4d5cd\") pod \"olm-operator-6b444d44fb-fzfv4\" (UID: \"d9c7aa27-d268-45e4-be93-97de8f8dfb8c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767275 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-secret-volume\") pod \"collect-profiles-29533905-jckrw\" (UID: \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767290 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/906509ff-be49-4c28-95b5-9f80cb885ece-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r5bpn\" (UID: \"906509ff-be49-4c28-95b5-9f80cb885ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767329 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-plugins-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767355 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767381 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbbbf\" (UniqueName: \"kubernetes.io/projected/9b1dc13b-9b02-42b0-a00e-21f15f9f98a2-kube-api-access-cbbbf\") pod \"control-plane-machine-set-operator-78cbb6b69f-dz785\" (UID: \"9b1dc13b-9b02-42b0-a00e-21f15f9f98a2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767398 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e21358c1-fad3-42c2-982d-8f3e50fadc34-service-ca-bundle\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767416 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/49fa31be-2461-4113-96b3-a1da363827c7-tmpfs\") pod \"packageserver-d55dfcdfc-crxdn\" (UID: \"49fa31be-2461-4113-96b3-a1da363827c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767430 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-registration-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767447 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-bound-sa-token\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767462 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e21358c1-fad3-42c2-982d-8f3e50fadc34-metrics-certs\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767479 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-socket-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: 
\"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767525 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gskxf\" (UniqueName: \"kubernetes.io/projected/727110e8-1674-467e-b39a-0fca0b874523-kube-api-access-gskxf\") pod \"catalog-operator-68c6474976-hhvn4\" (UID: \"727110e8-1674-467e-b39a-0fca0b874523\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767542 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8g7sk\" (UID: \"701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767558 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d9c7aa27-d268-45e4-be93-97de8f8dfb8c-srv-cert\") pod \"olm-operator-6b444d44fb-fzfv4\" (UID: \"d9c7aa27-d268-45e4-be93-97de8f8dfb8c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767576 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-trusted-ca\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767594 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmp7m\" (UniqueName: \"kubernetes.io/projected/0c63f82a-9346-476b-ae17-edb260b2a36f-kube-api-access-qmp7m\") pod \"auto-csr-approver-29533908-nq4vc\" (UID: \"0c63f82a-9346-476b-ae17-edb260b2a36f\") " pod="openshift-infra/auto-csr-approver-29533908-nq4vc" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767613 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b1dc13b-9b02-42b0-a00e-21f15f9f98a2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dz785\" (UID: \"9b1dc13b-9b02-42b0-a00e-21f15f9f98a2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767640 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gw42\" (UniqueName: \"kubernetes.io/projected/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-kube-api-access-2gw42\") pod \"collect-profiles-29533905-jckrw\" (UID: \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767677 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: 
\"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767698 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a22f23f6-fdc2-4842-bd45-3dd5695c48c6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vk5tb\" (UID: \"a22f23f6-fdc2-4842-bd45-3dd5695c48c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767718 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6dz4\" (UniqueName: \"kubernetes.io/projected/816999fe-cb2a-4f9b-b546-ca866b5aec3a-kube-api-access-k6dz4\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767734 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e21358c1-fad3-42c2-982d-8f3e50fadc34-stats-auth\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.767775 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8g7sk\" (UID: \"701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.768783 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.768984 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a22f23f6-fdc2-4842-bd45-3dd5695c48c6-images\") pod \"machine-config-operator-74547568cd-vk5tb\" (UID: \"a22f23f6-fdc2-4842-bd45-3dd5695c48c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.769190 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a22f23f6-fdc2-4842-bd45-3dd5695c48c6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vk5tb\" (UID: \"a22f23f6-fdc2-4842-bd45-3dd5695c48c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.769580 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-registry-certificates\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc 
kubenswrapper[4937]: I0225 15:49:46.770168 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/49fa31be-2461-4113-96b3-a1da363827c7-tmpfs\") pod \"packageserver-d55dfcdfc-crxdn\" (UID: \"49fa31be-2461-4113-96b3-a1da363827c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.778213 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d9c7aa27-d268-45e4-be93-97de8f8dfb8c-srv-cert\") pod \"olm-operator-6b444d44fb-fzfv4\" (UID: \"d9c7aa27-d268-45e4-be93-97de8f8dfb8c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" Feb 25 15:49:46 crc kubenswrapper[4937]: E0225 15:49:46.778879 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:47.278846801 +0000 UTC m=+238.292238701 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.779424 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a22f23f6-fdc2-4842-bd45-3dd5695c48c6-proxy-tls\") pod \"machine-config-operator-74547568cd-vk5tb\" (UID: \"a22f23f6-fdc2-4842-bd45-3dd5695c48c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.780309 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49fa31be-2461-4113-96b3-a1da363827c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-crxdn\" (UID: \"49fa31be-2461-4113-96b3-a1da363827c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.783728 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49fa31be-2461-4113-96b3-a1da363827c7-webhook-cert\") pod \"packageserver-d55dfcdfc-crxdn\" (UID: \"49fa31be-2461-4113-96b3-a1da363827c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.790287 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d9c7aa27-d268-45e4-be93-97de8f8dfb8c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fzfv4\" (UID: \"d9c7aa27-d268-45e4-be93-97de8f8dfb8c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.790298 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-registry-tls\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: 
\"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.790600 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.791385 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-trusted-ca\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.796048 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b1dc13b-9b02-42b0-a00e-21f15f9f98a2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dz785\" (UID: \"9b1dc13b-9b02-42b0-a00e-21f15f9f98a2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.805695 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng78b\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-kube-api-access-ng78b\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.824073 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbbbf\" (UniqueName: \"kubernetes.io/projected/9b1dc13b-9b02-42b0-a00e-21f15f9f98a2-kube-api-access-cbbbf\") pod \"control-plane-machine-set-operator-78cbb6b69f-dz785\" (UID: \"9b1dc13b-9b02-42b0-a00e-21f15f9f98a2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.844934 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-bound-sa-token\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.866976 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f275x\" (UniqueName: \"kubernetes.io/projected/49fa31be-2461-4113-96b3-a1da363827c7-kube-api-access-f275x\") pod \"packageserver-d55dfcdfc-crxdn\" (UID: \"49fa31be-2461-4113-96b3-a1da363827c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.870252 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6vfg\" (UniqueName: \"kubernetes.io/projected/cd38451a-2713-4257-ac27-d6f304d6c0fc-kube-api-access-r6vfg\") pod \"ingress-canary-dxb6p\" (UID: \"cd38451a-2713-4257-ac27-d6f304d6c0fc\") " pod="openshift-ingress-canary/ingress-canary-dxb6p" Feb 25 15:49:46 crc 
kubenswrapper[4937]: I0225 15:49:46.870443 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-config-volume\") pod \"collect-profiles-29533905-jckrw\" (UID: \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.870476 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e21358c1-fad3-42c2-982d-8f3e50fadc34-default-certificate\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.870537 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qznfh\" (UniqueName: \"kubernetes.io/projected/e21358c1-fad3-42c2-982d-8f3e50fadc34-kube-api-access-qznfh\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.870625 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvl4r\" (UniqueName: \"kubernetes.io/projected/54bfd5df-332a-44ff-9da6-5a2f6a29083c-kube-api-access-bvl4r\") pod \"kube-storage-version-migrator-operator-b67b599dd-dt5kp\" (UID: \"54bfd5df-332a-44ff-9da6-5a2f6a29083c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.870684 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54bfd5df-332a-44ff-9da6-5a2f6a29083c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dt5kp\" (UID: \"54bfd5df-332a-44ff-9da6-5a2f6a29083c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.870716 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d5cd\" (UniqueName: \"kubernetes.io/projected/d9c7aa27-d268-45e4-be93-97de8f8dfb8c-kube-api-access-4d5cd\") pod \"olm-operator-6b444d44fb-fzfv4\" (UID: \"d9c7aa27-d268-45e4-be93-97de8f8dfb8c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.872296 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-config-volume\") pod \"collect-profiles-29533905-jckrw\" (UID: \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.876972 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-secret-volume\") pod \"collect-profiles-29533905-jckrw\" (UID: \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.877029 4937 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/906509ff-be49-4c28-95b5-9f80cb885ece-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r5bpn\" (UID: \"906509ff-be49-4c28-95b5-9f80cb885ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.877109 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54bfd5df-332a-44ff-9da6-5a2f6a29083c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dt5kp\" (UID: \"54bfd5df-332a-44ff-9da6-5a2f6a29083c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.877444 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-plugins-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.877818 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-plugins-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.878646 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/906509ff-be49-4c28-95b5-9f80cb885ece-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r5bpn\" (UID: \"906509ff-be49-4c28-95b5-9f80cb885ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879281 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e21358c1-fad3-42c2-982d-8f3e50fadc34-service-ca-bundle\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879342 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-registration-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879377 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e21358c1-fad3-42c2-982d-8f3e50fadc34-metrics-certs\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879420 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-socket-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " 
pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879471 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gskxf\" (UniqueName: \"kubernetes.io/projected/727110e8-1674-467e-b39a-0fca0b874523-kube-api-access-gskxf\") pod \"catalog-operator-68c6474976-hhvn4\" (UID: \"727110e8-1674-467e-b39a-0fca0b874523\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879523 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8g7sk\" (UID: \"701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879582 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/62b20b63-dccd-4dd2-94c5-c89bf4d87585-metrics-tls\") pod \"dns-default-9nd26\" (UID: \"62b20b63-dccd-4dd2-94c5-c89bf4d87585\") " pod="openshift-dns/dns-default-9nd26" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879622 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmp7m\" (UniqueName: \"kubernetes.io/projected/0c63f82a-9346-476b-ae17-edb260b2a36f-kube-api-access-qmp7m\") pod \"auto-csr-approver-29533908-nq4vc\" (UID: \"0c63f82a-9346-476b-ae17-edb260b2a36f\") " pod="openshift-infra/auto-csr-approver-29533908-nq4vc" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879667 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gw42\" (UniqueName: \"kubernetes.io/projected/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-kube-api-access-2gw42\") pod \"collect-profiles-29533905-jckrw\" (UID: \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879698 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e21358c1-fad3-42c2-982d-8f3e50fadc34-stats-auth\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879746 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6dz4\" (UniqueName: \"kubernetes.io/projected/816999fe-cb2a-4f9b-b546-ca866b5aec3a-kube-api-access-k6dz4\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879788 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8g7sk\" (UID: \"701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879818 4937 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62b20b63-dccd-4dd2-94c5-c89bf4d87585-config-volume\") pod \"dns-default-9nd26\" (UID: \"62b20b63-dccd-4dd2-94c5-c89bf4d87585\") " pod="openshift-dns/dns-default-9nd26" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879845 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd38451a-2713-4257-ac27-d6f304d6c0fc-cert\") pod \"ingress-canary-dxb6p\" (UID: \"cd38451a-2713-4257-ac27-d6f304d6c0fc\") " pod="openshift-ingress-canary/ingress-canary-dxb6p" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879874 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/727110e8-1674-467e-b39a-0fca0b874523-profile-collector-cert\") pod \"catalog-operator-68c6474976-hhvn4\" (UID: \"727110e8-1674-467e-b39a-0fca0b874523\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879900 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8g7sk\" (UID: \"701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879936 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wpfw\" (UniqueName: \"kubernetes.io/projected/62b20b63-dccd-4dd2-94c5-c89bf4d87585-kube-api-access-8wpfw\") pod \"dns-default-9nd26\" (UID: \"62b20b63-dccd-4dd2-94c5-c89bf4d87585\") " pod="openshift-dns/dns-default-9nd26" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.879963 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-csi-data-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.880009 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.880087 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fcbm\" (UniqueName: \"kubernetes.io/projected/42f414e1-fd0d-4e08-8783-f68ae63af8c8-kube-api-access-5fcbm\") pod \"migrator-59844c95c7-l66s9\" (UID: \"42f414e1-fd0d-4e08-8783-f68ae63af8c8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l66s9" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.880143 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/906509ff-be49-4c28-95b5-9f80cb885ece-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-r5bpn\" (UID: \"906509ff-be49-4c28-95b5-9f80cb885ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.880188 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-mountpoint-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.880216 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/727110e8-1674-467e-b39a-0fca0b874523-srv-cert\") pod \"catalog-operator-68c6474976-hhvn4\" (UID: \"727110e8-1674-467e-b39a-0fca0b874523\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.880262 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54bfd5df-332a-44ff-9da6-5a2f6a29083c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dt5kp\" (UID: \"54bfd5df-332a-44ff-9da6-5a2f6a29083c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.880313 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvhhm\" (UniqueName: \"kubernetes.io/projected/906509ff-be49-4c28-95b5-9f80cb885ece-kube-api-access-hvhhm\") pod \"marketplace-operator-79b997595-r5bpn\" (UID: \"906509ff-be49-4c28-95b5-9f80cb885ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.880730 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e21358c1-fad3-42c2-982d-8f3e50fadc34-service-ca-bundle\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: E0225 15:49:46.881517 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:47.381501095 +0000 UTC m=+238.394893035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.882019 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-socket-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.888335 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-secret-volume\") pod \"collect-profiles-29533905-jckrw\" (UID: \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.890282 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8g7sk\" (UID: \"701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.890696 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e21358c1-fad3-42c2-982d-8f3e50fadc34-default-certificate\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.890773 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-registration-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.890806 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-mountpoint-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.890904 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/816999fe-cb2a-4f9b-b546-ca866b5aec3a-csi-data-dir\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.891637 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt27k\" (UniqueName: \"kubernetes.io/projected/a22f23f6-fdc2-4842-bd45-3dd5695c48c6-kube-api-access-bt27k\") pod \"machine-config-operator-74547568cd-vk5tb\" (UID: 
\"a22f23f6-fdc2-4842-bd45-3dd5695c48c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.891651 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8g7sk\" (UID: \"701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.894014 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/727110e8-1674-467e-b39a-0fca0b874523-srv-cert\") pod \"catalog-operator-68c6474976-hhvn4\" (UID: \"727110e8-1674-467e-b39a-0fca0b874523\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.907579 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e21358c1-fad3-42c2-982d-8f3e50fadc34-metrics-certs\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.908519 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/906509ff-be49-4c28-95b5-9f80cb885ece-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r5bpn\" (UID: \"906509ff-be49-4c28-95b5-9f80cb885ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.912611 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e21358c1-fad3-42c2-982d-8f3e50fadc34-stats-auth\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.914083 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl"] Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.925057 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.933957 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54bfd5df-332a-44ff-9da6-5a2f6a29083c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dt5kp\" (UID: \"54bfd5df-332a-44ff-9da6-5a2f6a29083c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.934009 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/727110e8-1674-467e-b39a-0fca0b874523-profile-collector-cert\") pod \"catalog-operator-68c6474976-hhvn4\" (UID: \"727110e8-1674-467e-b39a-0fca0b874523\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.935195 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qznfh\" (UniqueName: \"kubernetes.io/projected/e21358c1-fad3-42c2-982d-8f3e50fadc34-kube-api-access-qznfh\") pod \"router-default-5444994796-q2wd7\" (UID: \"e21358c1-fad3-42c2-982d-8f3e50fadc34\") " pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.950780 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvl4r\" (UniqueName: \"kubernetes.io/projected/54bfd5df-332a-44ff-9da6-5a2f6a29083c-kube-api-access-bvl4r\") pod \"kube-storage-version-migrator-operator-b67b599dd-dt5kp\" (UID: \"54bfd5df-332a-44ff-9da6-5a2f6a29083c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.960829 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.968315 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-57xqt"] Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.971459 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d5cd\" (UniqueName: \"kubernetes.io/projected/d9c7aa27-d268-45e4-be93-97de8f8dfb8c-kube-api-access-4d5cd\") pod \"olm-operator-6b444d44fb-fzfv4\" (UID: \"d9c7aa27-d268-45e4-be93-97de8f8dfb8c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.980299 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5"] Feb 25 15:49:46 crc kubenswrapper[4937]: E0225 15:49:46.981979 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:47.481955706 +0000 UTC m=+238.495347596 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.981954 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.985166 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.985261 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6vfg\" (UniqueName: \"kubernetes.io/projected/cd38451a-2713-4257-ac27-d6f304d6c0fc-kube-api-access-r6vfg\") pod \"ingress-canary-dxb6p\" (UID: \"cd38451a-2713-4257-ac27-d6f304d6c0fc\") " pod="openshift-ingress-canary/ingress-canary-dxb6p" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.985526 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/62b20b63-dccd-4dd2-94c5-c89bf4d87585-metrics-tls\") pod \"dns-default-9nd26\" (UID: \"62b20b63-dccd-4dd2-94c5-c89bf4d87585\") " pod="openshift-dns/dns-default-9nd26" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.985683 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62b20b63-dccd-4dd2-94c5-c89bf4d87585-config-volume\") pod \"dns-default-9nd26\" (UID: \"62b20b63-dccd-4dd2-94c5-c89bf4d87585\") " pod="openshift-dns/dns-default-9nd26" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.985780 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd38451a-2713-4257-ac27-d6f304d6c0fc-cert\") pod \"ingress-canary-dxb6p\" (UID: \"cd38451a-2713-4257-ac27-d6f304d6c0fc\") " pod="openshift-ingress-canary/ingress-canary-dxb6p" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.985988 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wpfw\" (UniqueName: \"kubernetes.io/projected/62b20b63-dccd-4dd2-94c5-c89bf4d87585-kube-api-access-8wpfw\") pod \"dns-default-9nd26\" (UID: \"62b20b63-dccd-4dd2-94c5-c89bf4d87585\") " pod="openshift-dns/dns-default-9nd26" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.987875 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.986562 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/62b20b63-dccd-4dd2-94c5-c89bf4d87585-config-volume\") pod \"dns-default-9nd26\" (UID: \"62b20b63-dccd-4dd2-94c5-c89bf4d87585\") " pod="openshift-dns/dns-default-9nd26" Feb 25 15:49:46 crc kubenswrapper[4937]: E0225 15:49:46.988260 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:47.488232578 +0000 UTC m=+238.501624468 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.991205 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6dz4\" (UniqueName: \"kubernetes.io/projected/816999fe-cb2a-4f9b-b546-ca866b5aec3a-kube-api-access-k6dz4\") pod \"csi-hostpathplugin-48pnj\" (UID: \"816999fe-cb2a-4f9b-b546-ca866b5aec3a\") " pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.993778 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/62b20b63-dccd-4dd2-94c5-c89bf4d87585-metrics-tls\") pod \"dns-default-9nd26\" (UID: \"62b20b63-dccd-4dd2-94c5-c89bf4d87585\") " pod="openshift-dns/dns-default-9nd26" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.996699 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd38451a-2713-4257-ac27-d6f304d6c0fc-cert\") pod \"ingress-canary-dxb6p\" (UID: \"cd38451a-2713-4257-ac27-d6f304d6c0fc\") " pod="openshift-ingress-canary/ingress-canary-dxb6p" Feb 25 15:49:46 crc kubenswrapper[4937]: I0225 15:49:46.998683 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gskxf\" (UniqueName: \"kubernetes.io/projected/727110e8-1674-467e-b39a-0fca0b874523-kube-api-access-gskxf\") pod \"catalog-operator-68c6474976-hhvn4\" (UID: \"727110e8-1674-467e-b39a-0fca0b874523\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:46.999945 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.018548 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8g7sk\" (UID: \"701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.045295 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmp7m\" (UniqueName: \"kubernetes.io/projected/0c63f82a-9346-476b-ae17-edb260b2a36f-kube-api-access-qmp7m\") pod \"auto-csr-approver-29533908-nq4vc\" (UID: \"0c63f82a-9346-476b-ae17-edb260b2a36f\") " pod="openshift-infra/auto-csr-approver-29533908-nq4vc" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.046678 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.056956 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.077045 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-48pnj" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.077964 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvhhm\" (UniqueName: \"kubernetes.io/projected/906509ff-be49-4c28-95b5-9f80cb885ece-kube-api-access-hvhhm\") pod \"marketplace-operator-79b997595-r5bpn\" (UID: \"906509ff-be49-4c28-95b5-9f80cb885ece\") " pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.079715 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gw42\" (UniqueName: \"kubernetes.io/projected/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-kube-api-access-2gw42\") pod \"collect-profiles-29533905-jckrw\" (UID: \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.090218 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:47 crc kubenswrapper[4937]: E0225 15:49:47.090571 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:47.590554384 +0000 UTC m=+238.603946274 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.102443 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fcbm\" (UniqueName: \"kubernetes.io/projected/42f414e1-fd0d-4e08-8783-f68ae63af8c8-kube-api-access-5fcbm\") pod \"migrator-59844c95c7-l66s9\" (UID: \"42f414e1-fd0d-4e08-8783-f68ae63af8c8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l66s9" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.122685 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.153019 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" Feb 25 15:49:47 crc kubenswrapper[4937]: W0225 15:49:47.156335 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8883398_bb74_4223_bde1_bd53e899926a.slice/crio-b5d66e3cab485fd2702b3184d2b11ae0b8558622f4b84f1e8ee3c96768296c6b WatchSource:0}: Error finding container b5d66e3cab485fd2702b3184d2b11ae0b8558622f4b84f1e8ee3c96768296c6b: Status 404 returned error can't find the container with id b5d66e3cab485fd2702b3184d2b11ae0b8558622f4b84f1e8ee3c96768296c6b Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.164548 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wpfw\" (UniqueName: \"kubernetes.io/projected/62b20b63-dccd-4dd2-94c5-c89bf4d87585-kube-api-access-8wpfw\") pod \"dns-default-9nd26\" (UID: \"62b20b63-dccd-4dd2-94c5-c89bf4d87585\") " pod="openshift-dns/dns-default-9nd26" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.166277 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6vfg\" (UniqueName: \"kubernetes.io/projected/cd38451a-2713-4257-ac27-d6f304d6c0fc-kube-api-access-r6vfg\") pod \"ingress-canary-dxb6p\" (UID: \"cd38451a-2713-4257-ac27-d6f304d6c0fc\") " pod="openshift-ingress-canary/ingress-canary-dxb6p" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.166882 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.185960 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8zn9j"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.186017 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.191811 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.192274 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.192465 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533908-nq4vc" Feb 25 15:49:47 crc kubenswrapper[4937]: E0225 15:49:47.196856 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:47.696837116 +0000 UTC m=+238.710229006 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.227809 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l66s9" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.262174 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7drd4"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.274762 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.295950 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:47 crc kubenswrapper[4937]: E0225 15:49:47.296110 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:47.796095928 +0000 UTC m=+238.809487818 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.296367 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:47 crc kubenswrapper[4937]: E0225 15:49:47.296846 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:47.796832286 +0000 UTC m=+238.810224176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.304973 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb"] Feb 25 15:49:47 crc kubenswrapper[4937]: W0225 15:49:47.307370 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf09db34_1df7_44a2_a584_a032476e4d66.slice/crio-9126f0d74fec251ab5643401a4bcc9193542accf0c4b0d5fe602920197cbef3a WatchSource:0}: Error finding container 9126f0d74fec251ab5643401a4bcc9193542accf0c4b0d5fe602920197cbef3a: Status 404 returned error can't find the container with id 9126f0d74fec251ab5643401a4bcc9193542accf0c4b0d5fe602920197cbef3a Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.309985 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:49:47 crc kubenswrapper[4937]: W0225 15:49:47.327582 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7210df16_765e_4b49_8b67_8989f4b2f15c.slice/crio-63bb65614bbe3552a528fa7ed1d946d23cdeefccc02b35684f7fb3e8da6e8797 WatchSource:0}: Error finding container 63bb65614bbe3552a528fa7ed1d946d23cdeefccc02b35684f7fb3e8da6e8797: Status 404 returned error can't find the container with id 63bb65614bbe3552a528fa7ed1d946d23cdeefccc02b35684f7fb3e8da6e8797 Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.327782 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.361833 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.366467 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rcxdq"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.381449 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds"] Feb 25 15:49:47 crc kubenswrapper[4937]: W0225 15:49:47.394059 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8214555c_5d28_43a6_8033_afe1e5a16c54.slice/crio-c3992aed8224d164fe8bc3efd3afe91ce5f850379c005038670d83fc90a276ca WatchSource:0}: Error finding container c3992aed8224d164fe8bc3efd3afe91ce5f850379c005038670d83fc90a276ca: Status 404 returned error can't find the container with id c3992aed8224d164fe8bc3efd3afe91ce5f850379c005038670d83fc90a276ca Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.397183 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:47 crc kubenswrapper[4937]: E0225 15:49:47.397763 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:47.897723027 +0000 UTC m=+238.911114917 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.399262 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dxb6p" Feb 25 15:49:47 crc kubenswrapper[4937]: W0225 15:49:47.410815 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa374000_00c7_43a6_b3a4_1ced809e17e9.slice/crio-a752a15727dc09ac5ed3d8ee5805ed816e06b432699aea1244c36a3fe478c22d WatchSource:0}: Error finding container a752a15727dc09ac5ed3d8ee5805ed816e06b432699aea1244c36a3fe478c22d: Status 404 returned error can't find the container with id a752a15727dc09ac5ed3d8ee5805ed816e06b432699aea1244c36a3fe478c22d Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.413758 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9nd26" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.477754 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.498179 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-b4jvp"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.498994 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:47 crc kubenswrapper[4937]: E0225 15:49:47.499309 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:47.999296415 +0000 UTC m=+239.012688305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.599344 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" event={"ID":"2bfcb195-48c8-46cd-b417-aacb40f615f4","Type":"ContainerStarted","Data":"1d0cda94bfd9bdb622a4c82533f7991f5cec72134d524fc6f290d0380b123755"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.599384 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" event={"ID":"2bfcb195-48c8-46cd-b417-aacb40f615f4","Type":"ContainerStarted","Data":"aa09fc324620036e0d23a7a731a101dbf6378d6cdedb7d540c14c19d05cca807"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.599430 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:47 crc kubenswrapper[4937]: E0225 15:49:47.599824 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:48.099809997 +0000 UTC m=+239.113201887 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.613767 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qfghw" event={"ID":"d33e6a6a-98b5-4eb8-8de5-8138395b48cb","Type":"ContainerStarted","Data":"65b0aec6985a633739457e8ce153a8d3ec3de94c39b60ac502c44900642041b8"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.614035 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qfghw" event={"ID":"d33e6a6a-98b5-4eb8-8de5-8138395b48cb","Type":"ContainerStarted","Data":"f484567797825151812d66dbeaed2aa4aa95707ef418f395997ed9fe4b7cf7d5"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.615658 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.649768 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" event={"ID":"aa374000-00c7-43a6-b3a4-1ced809e17e9","Type":"ContainerStarted","Data":"a752a15727dc09ac5ed3d8ee5805ed816e06b432699aea1244c36a3fe478c22d"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.652415 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" event={"ID":"92721dbb-2c2a-448a-801f-579a9d2d9566","Type":"ContainerStarted","Data":"129d3973a4535d17a347d77e09e8866a9f759fd9cfba9c162f4b59d5c3e5a961"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.664963 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" event={"ID":"8214555c-5d28-43a6-8033-afe1e5a16c54","Type":"ContainerStarted","Data":"c3992aed8224d164fe8bc3efd3afe91ce5f850379c005038670d83fc90a276ca"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.665930 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" event={"ID":"127a885b-d7f5-47ed-890d-159a75a7f79e","Type":"ContainerStarted","Data":"f633db8ffc836cfae2eb3244e0e34e148f670f5b5026c4497ba6bec0e58ec5ae"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.671590 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" event={"ID":"6a145826-4023-4211-aa90-aedba31d17c1","Type":"ContainerStarted","Data":"f4e58d6f14105b145f9147f255e2896f4d1d2e3ec67b44d86027c62fde8b8492"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.675428 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" event={"ID":"f8285777-1554-41ed-8fef-daf8637a4c5d","Type":"ContainerStarted","Data":"1772af3d46030daf4da1c6abda163756844d1a89180cee2b8833c80add139287"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.686703 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" 
event={"ID":"430d304c-8623-4d01-a878-5db061d6a5b8","Type":"ContainerStarted","Data":"d9a00f77a29ed0b768af3ba4dbc05c2b18ada066d4c6e1881b6308ea408c06ca"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.686741 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" event={"ID":"430d304c-8623-4d01-a878-5db061d6a5b8","Type":"ContainerStarted","Data":"6863f3ae6615f3ee645173a9a18e865a5da82aec7a20d7156ead0ffb4357fc2e"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.687066 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.688469 4937 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-29fxd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.688518 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" podUID="430d304c-8623-4d01-a878-5db061d6a5b8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.700829 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:47 crc kubenswrapper[4937]: E0225 15:49:47.701296 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:48.201276922 +0000 UTC m=+239.214668882 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.712656 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" event={"ID":"7210df16-765e-4b49-8b67-8989f4b2f15c","Type":"ContainerStarted","Data":"63bb65614bbe3552a528fa7ed1d946d23cdeefccc02b35684f7fb3e8da6e8797"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.806036 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:47 crc kubenswrapper[4937]: E0225 15:49:47.806137 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:48.306117399 +0000 UTC m=+239.319509299 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.806604 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:47 crc kubenswrapper[4937]: E0225 15:49:47.808084 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:48.308070636 +0000 UTC m=+239.321462616 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.813151 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" event={"ID":"bf09db34-1df7-44a2-a584-a032476e4d66","Type":"ContainerStarted","Data":"9126f0d74fec251ab5643401a4bcc9193542accf0c4b0d5fe602920197cbef3a"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.819930 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vd8vf" event={"ID":"d9c49432-4c74-4842-bdd2-880414a4ad0a","Type":"ContainerStarted","Data":"62241516ba0d2105e1fb63fb100bffdfaeb08da8d9ad52f659ce970776cdd73a"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.823478 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.837452 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" event={"ID":"bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13","Type":"ContainerStarted","Data":"1565d8054aadd7af981ecd3cca69d38cb69988018f397b70e60f314f1d5e1881"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.873842 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pplcq" event={"ID":"db1a84a9-f1b0-4dff-befd-796aebf7284b","Type":"ContainerStarted","Data":"c134b62249d87b36a2dfec388a7a2d9de6633499d664071d876ff3d4faf3f034"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.885062 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" event={"ID":"cc560d1e-8ebe-4c2d-8597-21407baf4406","Type":"ContainerStarted","Data":"e8671792111626b1b1826d12a0404b72711aab1dd77106ac785c8dff684d8937"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.896541 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.900112 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.916336 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:47 crc kubenswrapper[4937]: E0225 15:49:47.916967 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-25 15:49:48.416947901 +0000 UTC m=+239.430339791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.930314 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg" event={"ID":"dc52d338-32a0-4072-8b02-578a41f8b3bc","Type":"ContainerStarted","Data":"43c5042c73cc83009413f857105d44fe6dcd17a2463eed7b7389589038f8bc2f"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.936816 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" event={"ID":"7734eeb2-8011-4c7d-9614-e63f8d93b189","Type":"ContainerStarted","Data":"fab599a09bff5eb965b1e04eb9b50ad6f77b6d4c52549e93097f33a983008499"} Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.940259 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.973353 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533908-nq4vc"] Feb 25 15:49:47 crc kubenswrapper[4937]: I0225 15:49:47.975687 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-l66s9"] Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.003571 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" event={"ID":"33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d","Type":"ContainerStarted","Data":"06baf35c617e682c0c5a2f263d8feefdf9a4d44b8e226ebde3c07bf5cbef825b"} Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.012672 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dxb6p"] Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.016132 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" event={"ID":"18ba8725-4b8c-4dcb-b0d9-3d07364d5c30","Type":"ContainerStarted","Data":"836c3c4185c6de540e1090fea19137ae542d97d77f844b4a919e3c5f08a91dfc"} Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.018719 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.019150 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:48.519136774 +0000 UTC m=+239.532528664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.021464 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" event={"ID":"a8883398-bb74-4223-bde1-bd53e899926a","Type":"ContainerStarted","Data":"b5d66e3cab485fd2702b3184d2b11ae0b8558622f4b84f1e8ee3c96768296c6b"} Feb 25 15:49:48 crc kubenswrapper[4937]: W0225 15:49:48.035571 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd38451a_2713_4257_ac27_d6f304d6c0fc.slice/crio-057509b694d14b3d07bc5c2398f8a8f81830726c65391344320ff48a8ef8b3f7 WatchSource:0}: Error finding container 057509b694d14b3d07bc5c2398f8a8f81830726c65391344320ff48a8ef8b3f7: Status 404 returned error can't find the container with id 057509b694d14b3d07bc5c2398f8a8f81830726c65391344320ff48a8ef8b3f7 Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.038187 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-znkpp" event={"ID":"06ec1775-ce0a-4a78-b4ea-75de7a931917","Type":"ContainerStarted","Data":"c7f651f712c7dabc50cdb0a31979537a517f4436fc38f5b70d4c5844ab7b7a00"} Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.041318 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7drd4" event={"ID":"0f9fc900-6cf7-4890-8e7d-6925e9e3862f","Type":"ContainerStarted","Data":"d1dc99a1f915c55be44b07a372c11749fcf00cd4411018f84df79cfe69fa4521"} Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.043312 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" event={"ID":"b2a49754-862e-459c-bc50-f1b1b67cb4ea","Type":"ContainerStarted","Data":"e23d0a5c207e1e405b510bfc60b4010e7726979f75f7214a3abaa748ffc8154f"} Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.047455 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-djs85" event={"ID":"ff089f24-3d05-4c97-b6f7-3a39cbec049f","Type":"ContainerStarted","Data":"78701e0723c954f51d37416168268396db1d90ff2146a8f0702433550f339d45"} Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.050537 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" event={"ID":"b771f4d8-8253-4530-9e1a-e0ca06f263e4","Type":"ContainerStarted","Data":"7406cd0e397a27f805d395d42c9fa2f7a8cb8af0e5493d86adbd96097a145c22"} Feb 25 15:49:48 crc kubenswrapper[4937]: W0225 15:49:48.051700 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b1dc13b_9b02_42b0_a00e_21f15f9f98a2.slice/crio-264cea18e17c7bb5d4ea67a3b9809e96f893c8aec76d5e042f778641b6bc6e66 WatchSource:0}: Error finding container 264cea18e17c7bb5d4ea67a3b9809e96f893c8aec76d5e042f778641b6bc6e66: Status 404 returned error can't find the container with id 264cea18e17c7bb5d4ea67a3b9809e96f893c8aec76d5e042f778641b6bc6e66 Feb 25 15:49:48 crc 
kubenswrapper[4937]: I0225 15:49:48.094212 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-48pnj"] Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.119202 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.119283 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:48.619261927 +0000 UTC m=+239.632653807 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.124493 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.125775 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:48.625743574 +0000 UTC m=+239.639135464 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: W0225 15:49:48.135747 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod816999fe_cb2a_4f9b_b546_ca866b5aec3a.slice/crio-f3960750375bcf5e491c414a1c785da2828e6347e3d049d4b2ab8418fef1acd1 WatchSource:0}: Error finding container f3960750375bcf5e491c414a1c785da2828e6347e3d049d4b2ab8418fef1acd1: Status 404 returned error can't find the container with id f3960750375bcf5e491c414a1c785da2828e6347e3d049d4b2ab8418fef1acd1 Feb 25 15:49:48 crc kubenswrapper[4937]: W0225 15:49:48.143468 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c63f82a_9346_476b_ae17_edb260b2a36f.slice/crio-153d2a13710035e0387f4054ac624b53d6df4499ed57d91be0f0217848cd6dfd WatchSource:0}: Error finding container 153d2a13710035e0387f4054ac624b53d6df4499ed57d91be0f0217848cd6dfd: Status 404 returned error can't find the container with id 153d2a13710035e0387f4054ac624b53d6df4499ed57d91be0f0217848cd6dfd Feb 25 15:49:48 crc kubenswrapper[4937]: W0225 15:49:48.146083 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42f414e1_fd0d_4e08_8783_f68ae63af8c8.slice/crio-8f43064359788dd57419b98a8db888da7828fbf1baa04ec3ccdb005eea76a1c5 WatchSource:0}: Error finding container 8f43064359788dd57419b98a8db888da7828fbf1baa04ec3ccdb005eea76a1c5: Status 404 returned error can't find the container with id 8f43064359788dd57419b98a8db888da7828fbf1baa04ec3ccdb005eea76a1c5 Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.209784 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.230922 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.231591 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:48.731574915 +0000 UTC m=+239.744966805 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.233307 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk"] Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.302296 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r5bpn"] Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.332733 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.333067 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:48.833054831 +0000 UTC m=+239.846446721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.360239 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw"] Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.364509 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4"] Feb 25 15:49:48 crc kubenswrapper[4937]: W0225 15:49:48.370442 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod701d18f5_f4eb_49fa_92b8_7ef5fcc00ce3.slice/crio-c039241bea17b56f29afdb1903ad1ac0c7f126280c27d4a37dd733a157be1852 WatchSource:0}: Error finding container c039241bea17b56f29afdb1903ad1ac0c7f126280c27d4a37dd733a157be1852: Status 404 returned error can't find the container with id c039241bea17b56f29afdb1903ad1ac0c7f126280c27d4a37dd733a157be1852 Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.414969 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9nd26"] Feb 25 15:49:48 crc kubenswrapper[4937]: W0225 15:49:48.428312 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62b20b63_dccd_4dd2_94c5_c89bf4d87585.slice/crio-0b562b20e63a545c43a3610813e2cdb70d3291b047bd182e648076dd94356fa4 WatchSource:0}: Error finding container 
0b562b20e63a545c43a3610813e2cdb70d3291b047bd182e648076dd94356fa4: Status 404 returned error can't find the container with id 0b562b20e63a545c43a3610813e2cdb70d3291b047bd182e648076dd94356fa4 Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.433772 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.433933 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:48.933907691 +0000 UTC m=+239.947299581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.434061 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.434411 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:48.934404243 +0000 UTC m=+239.947796133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: W0225 15:49:48.450719 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3b2c333_3db5_4de3_bcc1_944dfc35b2b3.slice/crio-b861db428d3a67036ef53124ec0ad9b95bf73b7d62362fd3d0fa25a628c2a61a WatchSource:0}: Error finding container b861db428d3a67036ef53124ec0ad9b95bf73b7d62362fd3d0fa25a628c2a61a: Status 404 returned error can't find the container with id b861db428d3a67036ef53124ec0ad9b95bf73b7d62362fd3d0fa25a628c2a61a Feb 25 15:49:48 crc kubenswrapper[4937]: W0225 15:49:48.451595 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod727110e8_1674_467e_b39a_0fca0b874523.slice/crio-5d0b25cc88a0ed675c6fc02cf07eff9dd2de4fda761a39f79f216e5dc79dcbe9 WatchSource:0}: Error finding container 5d0b25cc88a0ed675c6fc02cf07eff9dd2de4fda761a39f79f216e5dc79dcbe9: Status 404 returned error can't find the container with id 5d0b25cc88a0ed675c6fc02cf07eff9dd2de4fda761a39f79f216e5dc79dcbe9 Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.536066 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.536959 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.036942095 +0000 UTC m=+240.050333985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.605753 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-djs85" podStartSLOduration=187.605726859 podStartE2EDuration="3m7.605726859s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:48.60163561 +0000 UTC m=+239.615027500" watchObservedRunningTime="2026-02-25 15:49:48.605726859 +0000 UTC m=+239.619118749" Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.638874 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" podStartSLOduration=187.638849911 podStartE2EDuration="3m7.638849911s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:48.634198368 +0000 UTC m=+239.647590278" watchObservedRunningTime="2026-02-25 15:49:48.638849911 +0000 UTC m=+239.652241801" Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.638949 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.639748 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.139731742 +0000 UTC m=+240.153123632 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.740061 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.740210 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.240185963 +0000 UTC m=+240.253577853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.740787 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.741157 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.241146246 +0000 UTC m=+240.254538216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.842041 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.842203 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.342180611 +0000 UTC m=+240.355572501 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.842356 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.842805 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.342790556 +0000 UTC m=+240.356182446 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.942919 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.943063 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.443045842 +0000 UTC m=+240.456437732 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:48 crc kubenswrapper[4937]: I0225 15:49:48.943244 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:48 crc kubenswrapper[4937]: E0225 15:49:48.943769 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.443762329 +0000 UTC m=+240.457154219 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.044023 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.044181 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.544157999 +0000 UTC m=+240.557549889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.044244 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.044618 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.54460999 +0000 UTC m=+240.558001880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.066891 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" event={"ID":"a22f23f6-fdc2-4842-bd45-3dd5695c48c6","Type":"ContainerStarted","Data":"c1af78efc04529f8c4601783958383ee269667464f2f0f8562f427b6bc1c49d6"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.068651 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" event={"ID":"bbf8eafe-4296-4ab8-9a4c-5c6051ff2b13","Type":"ContainerStarted","Data":"92352be5612c47d769f006df8d10876fecab65978e9874606fbb25cf0772a0bd"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.070420 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-znkpp" event={"ID":"06ec1775-ce0a-4a78-b4ea-75de7a931917","Type":"ContainerStarted","Data":"2d98973248f8ef5188f7c46233d849a8f3642bd4647afe7684318e009a9060ee"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.070665 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.073984 4937 patch_prober.go:28] interesting pod/console-operator-58897d9998-znkpp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.074035 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-znkpp" podUID="06ec1775-ce0a-4a78-b4ea-75de7a931917" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.074433 4937 generic.go:334] "Generic (PLEG): container finished" podID="0f9fc900-6cf7-4890-8e7d-6925e9e3862f" containerID="cf6c93abbff9f938b5a98cc13f016f023c979ee47213e63c6cc2fd32ae8c2707" exitCode=0 Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.074507 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7drd4" event={"ID":"0f9fc900-6cf7-4890-8e7d-6925e9e3862f","Type":"ContainerDied","Data":"cf6c93abbff9f938b5a98cc13f016f023c979ee47213e63c6cc2fd32ae8c2707"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.075900 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" event={"ID":"49fa31be-2461-4113-96b3-a1da363827c7","Type":"ContainerStarted","Data":"0c64742297388690fe882d007f3f817d41b5ee81872e04458a091007a46fbd1c"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.075939 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" 
event={"ID":"49fa31be-2461-4113-96b3-a1da363827c7","Type":"ContainerStarted","Data":"0f2b9d50f1b40a0c86833a42a8b1d50b052814425877c05ed3551276bdc023c6"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.084979 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rnsf5" podStartSLOduration=187.084960926 podStartE2EDuration="3m7.084960926s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:49.082801544 +0000 UTC m=+240.096193434" watchObservedRunningTime="2026-02-25 15:49:49.084960926 +0000 UTC m=+240.098352816" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.088782 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" event={"ID":"b771f4d8-8253-4530-9e1a-e0ca06f263e4","Type":"ContainerStarted","Data":"096fd1ba256aac0190d0c8386c67d9fcdbf3dd3fd696df3f74a8483ea439fa94"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.090757 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" event={"ID":"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3","Type":"ContainerStarted","Data":"b861db428d3a67036ef53124ec0ad9b95bf73b7d62362fd3d0fa25a628c2a61a"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.092930 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" event={"ID":"127a885b-d7f5-47ed-890d-159a75a7f79e","Type":"ContainerStarted","Data":"0bb7fdf44a887ff13fdc4b9bd4c32b53167b789971814b280cef92181fdcdea0"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.095040 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" event={"ID":"b2a49754-862e-459c-bc50-f1b1b67cb4ea","Type":"ContainerStarted","Data":"592ec0e6aa89b74baacb2ee98d6155b5a78d6b95e9ae5ca7240bfaac9d352568"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.096937 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" event={"ID":"701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3","Type":"ContainerStarted","Data":"c039241bea17b56f29afdb1903ad1ac0c7f126280c27d4a37dd733a157be1852"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.097913 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dxb6p" event={"ID":"cd38451a-2713-4257-ac27-d6f304d6c0fc","Type":"ContainerStarted","Data":"057509b694d14b3d07bc5c2398f8a8f81830726c65391344320ff48a8ef8b3f7"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.100991 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-znkpp" podStartSLOduration=188.100977154 podStartE2EDuration="3m8.100977154s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:49.099771765 +0000 UTC m=+240.113163665" watchObservedRunningTime="2026-02-25 15:49:49.100977154 +0000 UTC m=+240.114369044" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.101931 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" event={"ID":"6a145826-4023-4211-aa90-aedba31d17c1","Type":"ContainerStarted","Data":"84cc9893d6aba5a2fca1959fe45c368ec777f3735bf80bb1ae8b72cc6a23f627"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.102911 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.103894 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" event={"ID":"7734eeb2-8011-4c7d-9614-e63f8d93b189","Type":"ContainerStarted","Data":"58a74b9e91e3928fd903fe388f0c9ba01fe96908d1bfa6fe9cc0161af0a33ba6"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.106533 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" event={"ID":"7210df16-765e-4b49-8b67-8989f4b2f15c","Type":"ContainerStarted","Data":"65e50546ff2bff94b540c5e5ef768308dbdaa6e6e58764b6df257de7e7335d77"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.108649 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785" event={"ID":"9b1dc13b-9b02-42b0-a00e-21f15f9f98a2","Type":"ContainerStarted","Data":"264cea18e17c7bb5d4ea67a3b9809e96f893c8aec76d5e042f778641b6bc6e66"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.119075 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fnvmt" podStartSLOduration=188.119053511 podStartE2EDuration="3m8.119053511s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:49.116988911 +0000 UTC m=+240.130380791" watchObservedRunningTime="2026-02-25 15:49:49.119053511 +0000 UTC m=+240.132445401" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.120769 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" event={"ID":"33d6c5ad-ffa3-45f3-84e9-10b72bb10e5d","Type":"ContainerStarted","Data":"30ef6832cca9f98902353ea4dc1dc15f280487133b9e4cd92812dfa8670f9e24"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.125685 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vd8vf" event={"ID":"d9c49432-4c74-4842-bdd2-880414a4ad0a","Type":"ContainerStarted","Data":"dae24351144e2f191f51ee803738b86bb804b445acd2efd21fb0d1ae41d66e7f"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.125824 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-vd8vf" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.126895 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" event={"ID":"54bfd5df-332a-44ff-9da6-5a2f6a29083c","Type":"ContainerStarted","Data":"070beca50d89c0c6e2036323d99952d6e32962eaaa69dee9febd27fed338c3d6"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.138630 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" podStartSLOduration=188.138607625 
podStartE2EDuration="3m8.138607625s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:49.138344658 +0000 UTC m=+240.151736538" watchObservedRunningTime="2026-02-25 15:49:49.138607625 +0000 UTC m=+240.151999515" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.141125 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.141189 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.141538 4937 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6r6g7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.36:6443/healthz\": dial tcp 10.217.0.36:6443: connect: connection refused" start-of-body= Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.141679 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" podUID="6a145826-4023-4211-aa90-aedba31d17c1" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.36:6443/healthz\": dial tcp 10.217.0.36:6443: connect: connection refused" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.142745 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" event={"ID":"8214555c-5d28-43a6-8033-afe1e5a16c54","Type":"ContainerStarted","Data":"3441b7b197959b2ac4fc27dd19d96e96b940c7fafa3c5b798fbf585127c7e0f6"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.145458 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.145592 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.645576173 +0000 UTC m=+240.658968063 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.145833 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.145866 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" event={"ID":"bf09db34-1df7-44a2-a584-a032476e4d66","Type":"ContainerStarted","Data":"8ec9fc667000c543a11830f5cf34e7f3de85b13999fd38ea9a7d8b92f9d035ca"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.146548 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.148630 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.648607267 +0000 UTC m=+240.661999147 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.149619 4937 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-lgfgx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.150149 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" podUID="bf09db34-1df7-44a2-a584-a032476e4d66" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.151045 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" event={"ID":"8959ab96-5694-4631-b3f5-bfcb7213a21d","Type":"ContainerStarted","Data":"0f5a54928fe7ac1ca14047c4a8d935578060182e958c66e4486aaf4b0221fd0b"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.152757 4937 generic.go:334] "Generic (PLEG): container finished" podID="92721dbb-2c2a-448a-801f-579a9d2d9566" containerID="5812a7f8d67f482bcbc9930ff6a414a5cf3f77b8e4cc666bded3833bd4c74fb1" exitCode=0 Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.152814 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" event={"ID":"92721dbb-2c2a-448a-801f-579a9d2d9566","Type":"ContainerDied","Data":"5812a7f8d67f482bcbc9930ff6a414a5cf3f77b8e4cc666bded3833bd4c74fb1"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.155570 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pplcq" event={"ID":"db1a84a9-f1b0-4dff-befd-796aebf7284b","Type":"ContainerStarted","Data":"7ba09cf623751481bc0e6fc56ba29bbecc14282bd3dfbbc346a446202e6b9a56"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.156755 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjfbd" podStartSLOduration=188.156744693 podStartE2EDuration="3m8.156744693s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:49.156453676 +0000 UTC m=+240.169845566" watchObservedRunningTime="2026-02-25 15:49:49.156744693 +0000 UTC m=+240.170136583" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.159237 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-48pnj" event={"ID":"816999fe-cb2a-4f9b-b546-ca866b5aec3a","Type":"ContainerStarted","Data":"f3960750375bcf5e491c414a1c785da2828e6347e3d049d4b2ab8418fef1acd1"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.161208 4937 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-q2wd7" event={"ID":"e21358c1-fad3-42c2-982d-8f3e50fadc34","Type":"ContainerStarted","Data":"3f388037e5982718b8bfcc47cf27e5fe9c74013e77c0a3b8a689139f5905054d"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.162315 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-b4jvp" event={"ID":"dba71048-faea-4ee2-bec3-70c2fa66a7e8","Type":"ContainerStarted","Data":"608fabf38ce5e640ed90b14508e07cbd833ab1d8c8f2a3bcdcc076c3d4ec42af"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.167741 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg" event={"ID":"dc52d338-32a0-4072-8b02-578a41f8b3bc","Type":"ContainerStarted","Data":"9a3a6ad046711ba0f17e42432ec06be8d9b1693c2ecf7008d159e6a02e93e341"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.194829 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" podStartSLOduration=188.194793234 podStartE2EDuration="3m8.194793234s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:49.184189488 +0000 UTC m=+240.197581368" watchObservedRunningTime="2026-02-25 15:49:49.194793234 +0000 UTC m=+240.208185124" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.197051 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l66s9" event={"ID":"42f414e1-fd0d-4e08-8783-f68ae63af8c8","Type":"ContainerStarted","Data":"8f43064359788dd57419b98a8db888da7828fbf1baa04ec3ccdb005eea76a1c5"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.204260 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" event={"ID":"727110e8-1674-467e-b39a-0fca0b874523","Type":"ContainerStarted","Data":"5d0b25cc88a0ed675c6fc02cf07eff9dd2de4fda761a39f79f216e5dc79dcbe9"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.205747 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-vd8vf" podStartSLOduration=188.205725619 podStartE2EDuration="3m8.205725619s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:49.202022539 +0000 UTC m=+240.215414449" watchObservedRunningTime="2026-02-25 15:49:49.205725619 +0000 UTC m=+240.219117529" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.208130 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" event={"ID":"d9c7aa27-d268-45e4-be93-97de8f8dfb8c","Type":"ContainerStarted","Data":"3ecc89b884fe25207b06755b3092cbdaa487e6bb30ea5f73f567c7e73d03422c"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.209529 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" event={"ID":"a8883398-bb74-4223-bde1-bd53e899926a","Type":"ContainerStarted","Data":"67f1634581e0c3dbaea0497e10e1f2a6478b641e4a5494d4a970de55c4a924d5"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.213426 4937 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" event={"ID":"18ba8725-4b8c-4dcb-b0d9-3d07364d5c30","Type":"ContainerStarted","Data":"156c88ca6fbe0b465cb8269c7086e892526c9e37d3bdabe7c97b8c60e14d1e8a"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.215070 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533908-nq4vc" event={"ID":"0c63f82a-9346-476b-ae17-edb260b2a36f","Type":"ContainerStarted","Data":"153d2a13710035e0387f4054ac624b53d6df4499ed57d91be0f0217848cd6dfd"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.216163 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9nd26" event={"ID":"62b20b63-dccd-4dd2-94c5-c89bf4d87585","Type":"ContainerStarted","Data":"0b562b20e63a545c43a3610813e2cdb70d3291b047bd182e648076dd94356fa4"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.217361 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" event={"ID":"906509ff-be49-4c28-95b5-9f80cb885ece","Type":"ContainerStarted","Data":"0a3340ad207d5a69041d7085021f5b12193a326e7ab9a63db22cad6f744722a1"} Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.218449 4937 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-29fxd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.218519 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" podUID="430d304c-8623-4d01-a878-5db061d6a5b8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.247369 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.248824 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.748777881 +0000 UTC m=+240.762169771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.257121 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" podStartSLOduration=187.257093952 podStartE2EDuration="3m7.257093952s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:49.222275909 +0000 UTC m=+240.235667799" watchObservedRunningTime="2026-02-25 15:49:49.257093952 +0000 UTC m=+240.270485852" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.269869 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pplcq" podStartSLOduration=6.26984519 podStartE2EDuration="6.26984519s" podCreationTimestamp="2026-02-25 15:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:49.269233086 +0000 UTC m=+240.282624976" watchObservedRunningTime="2026-02-25 15:49:49.26984519 +0000 UTC m=+240.283237080" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.292693 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" podStartSLOduration=188.292674103 podStartE2EDuration="3m8.292674103s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:49.290182973 +0000 UTC m=+240.303574873" watchObservedRunningTime="2026-02-25 15:49:49.292674103 +0000 UTC m=+240.306065993" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.303395 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-57xqt" podStartSLOduration=187.303376802 podStartE2EDuration="3m7.303376802s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:49.302553032 +0000 UTC m=+240.315944922" watchObservedRunningTime="2026-02-25 15:49:49.303376802 +0000 UTC m=+240.316768692" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.321911 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mhzvb" podStartSLOduration=188.32188859 podStartE2EDuration="3m8.32188859s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:49.321018609 +0000 UTC m=+240.334410499" watchObservedRunningTime="2026-02-25 15:49:49.32188859 +0000 UTC m=+240.335280480" Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.355140 4937 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.355607 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.855593556 +0000 UTC m=+240.868985446 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.457720 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.458191 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:49.958176678 +0000 UTC m=+240.971568568 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.559743 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.560592 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.060575556 +0000 UTC m=+241.073967446 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.661409 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.661626 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.16158563 +0000 UTC m=+241.174977520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.661853 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.662313 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.162304838 +0000 UTC m=+241.175696728 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.762705 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.762859 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.26283464 +0000 UTC m=+241.276226540 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.763518 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.764042 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.264020029 +0000 UTC m=+241.277411919 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.864914 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.865973 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.365958626 +0000 UTC m=+241.379350516 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:49 crc kubenswrapper[4937]: I0225 15:49:49.967083 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:49 crc kubenswrapper[4937]: E0225 15:49:49.967846 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.467834081 +0000 UTC m=+241.481225971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.068567 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.068789 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.568759264 +0000 UTC m=+241.582151154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.069199 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.069571 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.569561833 +0000 UTC m=+241.582953723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.170452 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.170647 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.670616509 +0000 UTC m=+241.684008399 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.170803 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.171225 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.671217203 +0000 UTC m=+241.684609093 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.225208 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" event={"ID":"d9c7aa27-d268-45e4-be93-97de8f8dfb8c","Type":"ContainerStarted","Data":"782952b2995654943c37947f4bb0bf09c72f3c199536d5255f8bce548707d650"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.225536 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.227232 4937 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-fzfv4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.227279 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" podUID="d9c7aa27-d268-45e4-be93-97de8f8dfb8c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.228040 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" event={"ID":"7210df16-765e-4b49-8b67-8989f4b2f15c","Type":"ContainerStarted","Data":"de2c55062270fe79b7297ee5bd2d8cfceea8ba965cfb74fcf5086f1a70abdb3d"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.230558 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785" event={"ID":"9b1dc13b-9b02-42b0-a00e-21f15f9f98a2","Type":"ContainerStarted","Data":"15083cf85fd2d5656ed8d2f192c65a6a2556f18991b2db9af4b8124c531c08a8"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.232215 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" event={"ID":"8959ab96-5694-4631-b3f5-bfcb7213a21d","Type":"ContainerStarted","Data":"1fcc3d4676883d8614427df3cf733d90b9f71500b0f5869a4e3408f84d681f68"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.234961 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" event={"ID":"92721dbb-2c2a-448a-801f-579a9d2d9566","Type":"ContainerStarted","Data":"78f22b32dfe2353dd4f611033b80a6aaa2cf4ccc4241f418a4cf175262c29d42"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.235069 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.237172 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" event={"ID":"54bfd5df-332a-44ff-9da6-5a2f6a29083c","Type":"ContainerStarted","Data":"0d90ea8556083ce2e926ef34b87013f9ca048d3bfdb6d786fb7d8139389d391e"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.238768 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dxb6p" event={"ID":"cd38451a-2713-4257-ac27-d6f304d6c0fc","Type":"ContainerStarted","Data":"99656934e95fb9c43f543d4eff0980a7a33144fc662dc8d0d3319007819d542a"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.240094 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" event={"ID":"727110e8-1674-467e-b39a-0fca0b874523","Type":"ContainerStarted","Data":"e8e601b9f2b8dce1055492fad3188e6680939bcc2afd5f80ba4a52dd5e7959ae"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.240552 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.242059 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg" event={"ID":"dc52d338-32a0-4072-8b02-578a41f8b3bc","Type":"ContainerStarted","Data":"0052afb4d480fe90434b261d096656c8c57216a195a45e8a1857323af2de53f5"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.242759 4937 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hhvn4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.242800 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" podUID="727110e8-1674-467e-b39a-0fca0b874523" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.245942 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" podStartSLOduration=188.245931511 podStartE2EDuration="3m8.245931511s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.245826329 +0000 UTC m=+241.259218219" watchObservedRunningTime="2026-02-25 15:49:50.245931511 +0000 UTC m=+241.259323401" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.248098 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" event={"ID":"a22f23f6-fdc2-4842-bd45-3dd5695c48c6","Type":"ContainerStarted","Data":"4cc901967c937c58925c178b015787d16bd25c686792b6bc85acb407fd952e12"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.248218 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" 
event={"ID":"a22f23f6-fdc2-4842-bd45-3dd5695c48c6","Type":"ContainerStarted","Data":"bdb067d724a30a199e1df22a4edece6ae560d5a265afeaf9368d1fb0c9c268e1"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.251465 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" event={"ID":"aa374000-00c7-43a6-b3a4-1ced809e17e9","Type":"ContainerStarted","Data":"768f7951d523f90eed131d962241bfbda6f66e23a23aeaa95c92606d9ec751d4"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.253935 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" event={"ID":"8214555c-5d28-43a6-8033-afe1e5a16c54","Type":"ContainerStarted","Data":"490b6bf122b5030d4288d013fffb90de80035f066c13d857e4cccad9c6ad03d3"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.256109 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9nd26" event={"ID":"62b20b63-dccd-4dd2-94c5-c89bf4d87585","Type":"ContainerStarted","Data":"cf8b5086aa738f6d2526d2f52e0fcfefb93c821f19b4b570202a16c52fe89192"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.257612 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" event={"ID":"906509ff-be49-4c28-95b5-9f80cb885ece","Type":"ContainerStarted","Data":"d55d9613d6b79dc417482e2442fd54315d5e7e3d9d531b239fde28469b537c11"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.257911 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.259321 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" event={"ID":"701d18f5-f4eb-49fa-92b8-7ef5fcc00ce3","Type":"ContainerStarted","Data":"5487b72c20c4db76cedbdb9346845036940353fc01929b49609e851ced11b040"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.262789 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" event={"ID":"cc560d1e-8ebe-4c2d-8597-21407baf4406","Type":"ContainerStarted","Data":"0399c99e80158e9d31e3ccb2f46938fc62c584525496a264f2f5e1d72eaba2f7"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.264108 4937 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r5bpn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.264215 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.265801 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" event={"ID":"127a885b-d7f5-47ed-890d-159a75a7f79e","Type":"ContainerStarted","Data":"00888f2f6e2a635d34e44b8c0474ebbb18db6c8291095a5a1145d87fae14c6db"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.270250 
4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4jjbg" podStartSLOduration=189.270228869 podStartE2EDuration="3m9.270228869s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.269636245 +0000 UTC m=+241.283028135" watchObservedRunningTime="2026-02-25 15:49:50.270228869 +0000 UTC m=+241.283620759" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.271778 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.273129 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.773087478 +0000 UTC m=+241.786479368 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.283214 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" event={"ID":"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3","Type":"ContainerStarted","Data":"48d01a04fa661c299849ab0a5e5e6fdd42337fce0475e6e6a26fe17d302c978d"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.288225 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-q2wd7" event={"ID":"e21358c1-fad3-42c2-982d-8f3e50fadc34","Type":"ContainerStarted","Data":"ec534da8edf00f6aa0790a65fd26acdc5abe8393bbb990585e2d76d0e752f384"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.298427 4937 generic.go:334] "Generic (PLEG): container finished" podID="f8285777-1554-41ed-8fef-daf8637a4c5d" containerID="8407c06240b373f82d92dfcfeebb87df286c8bf7e5758337e7425630a2146a12" exitCode=0 Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.299790 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" event={"ID":"f8285777-1554-41ed-8fef-daf8637a4c5d","Type":"ContainerDied","Data":"8407c06240b373f82d92dfcfeebb87df286c8bf7e5758337e7425630a2146a12"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.316903 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785" podStartSLOduration=188.316879058 podStartE2EDuration="3m8.316879058s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.315845453 +0000 UTC 
m=+241.329237343" watchObservedRunningTime="2026-02-25 15:49:50.316879058 +0000 UTC m=+241.330270948" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.328646 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l66s9" event={"ID":"42f414e1-fd0d-4e08-8783-f68ae63af8c8","Type":"ContainerStarted","Data":"741d1a2f98213036ae9043d245b0ad15db893f6e83475e2f5fc9311e54233866"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.350795 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" podStartSLOduration=189.350777738 podStartE2EDuration="3m9.350777738s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.348739759 +0000 UTC m=+241.362131649" watchObservedRunningTime="2026-02-25 15:49:50.350777738 +0000 UTC m=+241.364169628" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.353787 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-b4jvp" event={"ID":"dba71048-faea-4ee2-bec3-70c2fa66a7e8","Type":"ContainerStarted","Data":"0161b0c070250aeb959ffd7c4f186ccefa1b47d681f8d14de54d41688887195f"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.366153 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" event={"ID":"7734eeb2-8011-4c7d-9614-e63f8d93b189","Type":"ContainerStarted","Data":"fcc82a380d4a7a9f2fb15985e78da75b18a5e89459b18095021a50fe91e461ec"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.367117 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.375606 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dt5kp" podStartSLOduration=188.375583639 podStartE2EDuration="3m8.375583639s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.371968361 +0000 UTC m=+241.385360251" watchObservedRunningTime="2026-02-25 15:49:50.375583639 +0000 UTC m=+241.388975529" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.376652 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.379823 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.879809191 +0000 UTC m=+241.893201081 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.388918 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qfghw" event={"ID":"d33e6a6a-98b5-4eb8-8de5-8138395b48cb","Type":"ContainerStarted","Data":"e80ee6801b041516b9a30fd19d39f7ef1547ef492a0216f6f7495fb73fbab9d2"} Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.390738 4937 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-lgfgx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.390782 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" podUID="bf09db34-1df7-44a2-a584-a032476e4d66" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.400709 4937 patch_prober.go:28] interesting pod/console-operator-58897d9998-znkpp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.400867 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-znkpp" podUID="06ec1775-ce0a-4a78-b4ea-75de7a931917" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.401262 4937 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-crxdn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body= Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.401339 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" podUID="49fa31be-2461-4113-96b3-a1da363827c7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.401973 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.402779 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 
10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.402813 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.410398 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dxb6p" podStartSLOduration=7.41037069 podStartE2EDuration="7.41037069s" podCreationTimestamp="2026-02-25 15:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.402143871 +0000 UTC m=+241.415535761" watchObservedRunningTime="2026-02-25 15:49:50.41037069 +0000 UTC m=+241.423762580" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.433904 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl268" podStartSLOduration=189.433877879 podStartE2EDuration="3m9.433877879s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.432617859 +0000 UTC m=+241.446009759" watchObservedRunningTime="2026-02-25 15:49:50.433877879 +0000 UTC m=+241.447269769" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.467047 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" podStartSLOduration=188.467025341 podStartE2EDuration="3m8.467025341s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.464224734 +0000 UTC m=+241.477616624" watchObservedRunningTime="2026-02-25 15:49:50.467025341 +0000 UTC m=+241.480417231" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.477423 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.479135 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:50.979114074 +0000 UTC m=+241.992505964 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.500387 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-8zn9j" podStartSLOduration=188.500358628 podStartE2EDuration="3m8.500358628s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.495665014 +0000 UTC m=+241.509056894" watchObservedRunningTime="2026-02-25 15:49:50.500358628 +0000 UTC m=+241.513750528" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.520857 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mrcl" podStartSLOduration=188.520838724 podStartE2EDuration="3m8.520838724s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.520217489 +0000 UTC m=+241.533609379" watchObservedRunningTime="2026-02-25 15:49:50.520838724 +0000 UTC m=+241.534230614" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.592339 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" podStartSLOduration=189.592321704 podStartE2EDuration="3m9.592321704s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.558240139 +0000 UTC m=+241.571632029" watchObservedRunningTime="2026-02-25 15:49:50.592321704 +0000 UTC m=+241.605713594" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.594147 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.594517 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.094507126 +0000 UTC m=+242.107899016 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.622807 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" podStartSLOduration=188.622788661 podStartE2EDuration="3m8.622788661s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.594107247 +0000 UTC m=+241.607499147" watchObservedRunningTime="2026-02-25 15:49:50.622788661 +0000 UTC m=+241.636180561" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.658016 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-qfghw" podStartSLOduration=189.658000123 podStartE2EDuration="3m9.658000123s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.655090443 +0000 UTC m=+241.668482333" watchObservedRunningTime="2026-02-25 15:49:50.658000123 +0000 UTC m=+241.671392013" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.658950 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-q2wd7" podStartSLOduration=189.658943706 podStartE2EDuration="3m9.658943706s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.62316095 +0000 UTC m=+241.636552850" watchObservedRunningTime="2026-02-25 15:49:50.658943706 +0000 UTC m=+241.672335596" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.678005 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-rcxdq" podStartSLOduration=189.677983867 podStartE2EDuration="3m9.677983867s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.677790902 +0000 UTC m=+241.691182812" watchObservedRunningTime="2026-02-25 15:49:50.677983867 +0000 UTC m=+241.691375757" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.695307 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.695499 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-25 15:49:51.195449779 +0000 UTC m=+242.208841679 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.695692 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.696065 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.196052784 +0000 UTC m=+242.209444674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.705807 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gvtds" podStartSLOduration=189.705778059 podStartE2EDuration="3m9.705778059s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.703375041 +0000 UTC m=+241.716766961" watchObservedRunningTime="2026-02-25 15:49:50.705778059 +0000 UTC m=+241.719169949" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.756319 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gltdt" podStartSLOduration=189.756295722 podStartE2EDuration="3m9.756295722s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.752254404 +0000 UTC m=+241.765646304" watchObservedRunningTime="2026-02-25 15:49:50.756295722 +0000 UTC m=+241.769687612" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.796773 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.797242 4937 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.297204772 +0000 UTC m=+242.310596662 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.807512 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" podStartSLOduration=188.8074531 podStartE2EDuration="3m8.8074531s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.798352579 +0000 UTC m=+241.811744469" watchObservedRunningTime="2026-02-25 15:49:50.8074531 +0000 UTC m=+241.820844990" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.812848 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.831864 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" podStartSLOduration=188.83184821 podStartE2EDuration="3m8.83184821s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.830738703 +0000 UTC m=+241.844130593" watchObservedRunningTime="2026-02-25 15:49:50.83184821 +0000 UTC m=+241.845240100" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.897921 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.898276 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.398261527 +0000 UTC m=+242.411653407 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.907202 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8g7sk" podStartSLOduration=189.907187463 podStartE2EDuration="3m9.907187463s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:50.88225947 +0000 UTC m=+241.895651370" watchObservedRunningTime="2026-02-25 15:49:50.907187463 +0000 UTC m=+241.920579353" Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.998680 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.998908 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.498876662 +0000 UTC m=+242.512268552 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:50 crc kubenswrapper[4937]: I0225 15:49:50.998981 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:50 crc kubenswrapper[4937]: E0225 15:49:50.999314 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.499299402 +0000 UTC m=+242.512691292 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.047552 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.049616 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.049708 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.100083 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.100258 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.600227065 +0000 UTC m=+242.613618955 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.100402 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.100842 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.600835019 +0000 UTC m=+242.614226909 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.200977 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.201146 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.701121895 +0000 UTC m=+242.714513785 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.201391 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.201778 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.701764011 +0000 UTC m=+242.715155901 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.304383 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.304623 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.804592079 +0000 UTC m=+242.817983969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.305061 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.305356 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.805344667 +0000 UTC m=+242.818736557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.406044 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.408197 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.908174836 +0000 UTC m=+242.921566726 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.409201 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.409798 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:51.909777244 +0000 UTC m=+242.923169134 (durationBeforeRetry 500ms). 
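The mount and unmount retries above all fail for the same reason: the kubelet cannot find kubevirt.io.hostpath-provisioner in its list of registered CSI drivers, because the csi-hostpathplugin pod has not finished registering yet. A minimal sketch for checking what the node has actually registered, assuming a reachable cluster, a valid kubeconfig, and the Python kubernetes client; the node name "crc" and the driver name are taken from the log, everything else is illustrative:

# Sketch: compare the cluster's CSIDriver objects with what this node has registered.
# Assumes `pip install kubernetes` and a kubeconfig that can reach the cluster.
from kubernetes import client, config

config.load_kube_config()
storage = client.StorageV1Api()

# CSIDriver objects describe drivers installed in the cluster.
print("CSIDriver objects:", [d.metadata.name for d in storage.list_csi_driver().items])

# CSINode lists the drivers that have registered with this node's kubelet.
csinode = storage.read_csi_node("crc")
registered = [d.name for d in (csinode.spec.drivers or [])]
print("Registered on node crc:", registered)
print("hostpath driver registered:", "kubevirt.io.hostpath-provisioner" in registered)

If the driver is missing from CSINode while the csi-hostpathplugin containers are still starting (its ContainerStarted event appears further down in this log), the 500ms retries normally clear on their own once registration completes.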
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.430956 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" event={"ID":"f8285777-1554-41ed-8fef-daf8637a4c5d","Type":"ContainerStarted","Data":"4053d2aae159f93e5f11eed6be69f3f6c53f5c51d1b2d6bc68c6821ec128ab23"} Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.443144 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l66s9" event={"ID":"42f414e1-fd0d-4e08-8783-f68ae63af8c8","Type":"ContainerStarted","Data":"129fa9cfa1c2f324df3ec62aca8c2aa1d54455380d25dfe49b89b2b77d9074a1"} Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.464871 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" podStartSLOduration=189.464850357 podStartE2EDuration="3m9.464850357s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:51.461884185 +0000 UTC m=+242.475276075" watchObservedRunningTime="2026-02-25 15:49:51.464850357 +0000 UTC m=+242.478242247" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.467310 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7drd4" event={"ID":"0f9fc900-6cf7-4890-8e7d-6925e9e3862f","Type":"ContainerStarted","Data":"beaa454a88a090163814ae733c72655e903db60d655f114376244da349996b0b"} Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.470543 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.470758 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.473004 4937 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-5jprg container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.18:8443/livez\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.473046 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" podUID="f8285777-1554-41ed-8fef-daf8637a4c5d" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.18:8443/livez\": dial tcp 10.217.0.18:8443: connect: connection refused" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.475342 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-b4jvp" event={"ID":"dba71048-faea-4ee2-bec3-70c2fa66a7e8","Type":"ContainerStarted","Data":"8c3faefe2e6db7aff05a02156a236d40a425aa7d5daab1f7f0c767ba087152f4"} Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.488218 4937 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9nd26" event={"ID":"62b20b63-dccd-4dd2-94c5-c89bf4d87585","Type":"ContainerStarted","Data":"08da0a4306d6ab126e546cb015b596d729dea0e4f225c2b0c636c1611c474eda"} Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.489333 4937 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hhvn4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.489353 4937 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-crxdn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body= Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.489380 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" podUID="727110e8-1674-467e-b39a-0fca0b874523" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.489399 4937 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-fzfv4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.489428 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" podUID="d9c7aa27-d268-45e4-be93-97de8f8dfb8c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.489422 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" podUID="49fa31be-2461-4113-96b3-a1da363827c7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.489513 4937 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r5bpn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.489527 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.490257 4937 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-lgfgx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.490309 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" podUID="bf09db34-1df7-44a2-a584-a032476e4d66" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.492227 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l66s9" podStartSLOduration=189.492209139 podStartE2EDuration="3m9.492209139s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:51.489599376 +0000 UTC m=+242.502991266" watchObservedRunningTime="2026-02-25 15:49:51.492209139 +0000 UTC m=+242.505601029" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.514755 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.515588 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.015559094 +0000 UTC m=+243.028950984 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.515689 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.516166 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.016150259 +0000 UTC m=+243.029542149 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.537099 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-b4jvp" podStartSLOduration=189.537063815 podStartE2EDuration="3m9.537063815s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:51.530072445 +0000 UTC m=+242.543464355" watchObservedRunningTime="2026-02-25 15:49:51.537063815 +0000 UTC m=+242.550455695" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.581430 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9nd26" podStartSLOduration=8.581410298 podStartE2EDuration="8.581410298s" podCreationTimestamp="2026-02-25 15:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:51.552801996 +0000 UTC m=+242.566193876" watchObservedRunningTime="2026-02-25 15:49:51.581410298 +0000 UTC m=+242.594802188" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.617239 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.617405 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.117369898 +0000 UTC m=+243.130761788 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.618607 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.635023 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-25 15:49:52.134977834 +0000 UTC m=+243.148369724 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.635567 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vk5tb" podStartSLOduration=189.635540868 podStartE2EDuration="3m9.635540868s" podCreationTimestamp="2026-02-25 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:51.614707464 +0000 UTC m=+242.628099374" watchObservedRunningTime="2026-02-25 15:49:51.635540868 +0000 UTC m=+242.648932758" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.735993 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.737530 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.237513975 +0000 UTC m=+243.250905865 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.779043 4937 ???:1] "http: TLS handshake error from 192.168.126.11:41874: no serving certificate available for the kubelet" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.839734 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.841145 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.341122243 +0000 UTC m=+243.354514133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.897824 4937 ???:1] "http: TLS handshake error from 192.168.126.11:41888: no serving certificate available for the kubelet" Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.949079 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.949584 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.449559867 +0000 UTC m=+243.462951757 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.949745 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:51 crc kubenswrapper[4937]: E0225 15:49:51.950202 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.450194842 +0000 UTC m=+243.463586732 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:51 crc kubenswrapper[4937]: I0225 15:49:51.994428 4937 ???:1] "http: TLS handshake error from 192.168.126.11:41890: no serving certificate available for the kubelet" Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.050006 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.050091 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.051288 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:52 crc kubenswrapper[4937]: E0225 15:49:52.051496 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.551447432 +0000 UTC m=+243.564839322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.051694 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:52 crc kubenswrapper[4937]: E0225 15:49:52.052201 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.55219346 +0000 UTC m=+243.565585350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.096948 4937 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-zv69t container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.097011 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" podUID="92721dbb-2c2a-448a-801f-579a9d2d9566" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.097129 4937 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-zv69t container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.097147 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" podUID="92721dbb-2c2a-448a-801f-579a9d2d9566" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.101842 4937 ???:1] "http: TLS handshake error from 192.168.126.11:41896: no serving certificate available for the kubelet" Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.153226 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:52 crc kubenswrapper[4937]: E0225 15:49:52.153794 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.653764148 +0000 UTC m=+243.667156038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.199601 4937 ???:1] "http: TLS handshake error from 192.168.126.11:41912: no serving certificate available for the kubelet" Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.255739 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:52 crc kubenswrapper[4937]: E0225 15:49:52.256250 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.756232268 +0000 UTC m=+243.769624158 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.309433 4937 ???:1] "http: TLS handshake error from 192.168.126.11:41914: no serving certificate available for the kubelet" Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.356439 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:52 crc kubenswrapper[4937]: E0225 15:49:52.356698 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.856655568 +0000 UTC m=+243.870047458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.356831 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:52 crc kubenswrapper[4937]: E0225 15:49:52.357154 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.85713658 +0000 UTC m=+243.870528470 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.458529 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:52 crc kubenswrapper[4937]: E0225 15:49:52.459088 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:52.959066497 +0000 UTC m=+243.972458377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.507966 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7drd4" event={"ID":"0f9fc900-6cf7-4890-8e7d-6925e9e3862f","Type":"ContainerStarted","Data":"1ce745e97ba42c777e1ce069fa1df71a467b26ec56bb5e0d135608f55882caf1"} Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.519506 4937 ???:1] "http: TLS handshake error from 192.168.126.11:41926: no serving certificate available for the kubelet" Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.524696 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-48pnj" event={"ID":"816999fe-cb2a-4f9b-b546-ca866b5aec3a","Type":"ContainerStarted","Data":"4c427a613485056d94f311d831cd752c939afc94228f2fc51c9a77531c887124"} Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.525835 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-9nd26" Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.555058 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-7drd4" podStartSLOduration=191.555028209 podStartE2EDuration="3m11.555028209s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:52.552080557 +0000 UTC m=+243.565472467" watchObservedRunningTime="2026-02-25 15:49:52.555028209 +0000 UTC m=+243.568420099" Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.561120 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:52 crc kubenswrapper[4937]: E0225 15:49:52.561625 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:53.061603978 +0000 UTC m=+244.074995868 (durationBeforeRetry 500ms). 
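The recurring "http: TLS handshake error ... no serving certificate available for the kubelet" messages indicate the kubelet is still waiting for its serving certificate. A minimal sketch for inspecting the corresponding CertificateSigningRequests, assuming the same Python kubernetes client and kubeconfig; kubernetes.io/kubelet-serving is the standard signer for kubelet serving certificates, and the rest is illustrative:

# Sketch: list kubelet-serving CSRs and show whether each has been approved and issued.
from kubernetes import client, config

config.load_kube_config()
certs = client.CertificatesV1Api()

for csr in certs.list_certificate_signing_request().items:
    if csr.spec.signer_name != "kubernetes.io/kubelet-serving":
        continue
    status = csr.status or client.V1CertificateSigningRequestStatus()
    conditions = [c.type for c in (status.conditions or [])] or ["Pending"]
    print(csr.metadata.name, csr.spec.username, conditions, "issued:", bool(status.certificate))

If such a CSR stays pending, it may need manual approval (for example with "oc adm certificate approve <name>"), after which the handshake errors stop.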
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.667814 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:52 crc kubenswrapper[4937]: E0225 15:49:52.669117 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:53.169080909 +0000 UTC m=+244.182472829 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.769586 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:52 crc kubenswrapper[4937]: E0225 15:49:52.770611 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:53.270597745 +0000 UTC m=+244.283989635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.871873 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:52 crc kubenswrapper[4937]: E0225 15:49:52.872067 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:53.37203816 +0000 UTC m=+244.385430050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.872242 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:52 crc kubenswrapper[4937]: E0225 15:49:52.872594 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:53.372585944 +0000 UTC m=+244.385977824 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.879736 4937 ???:1] "http: TLS handshake error from 192.168.126.11:41938: no serving certificate available for the kubelet" Feb 25 15:49:52 crc kubenswrapper[4937]: I0225 15:49:52.973987 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:52 crc kubenswrapper[4937]: E0225 15:49:52.974386 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:53.474365136 +0000 UTC m=+244.487757036 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.052429 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:49:53 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:49:53 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:49:53 crc kubenswrapper[4937]: healthz check failed Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.052475 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.075526 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.076116 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:53.576092858 +0000 UTC m=+244.589484748 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.177028 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.177244 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:53.677210275 +0000 UTC m=+244.690602175 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.177434 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.177866 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:53.677849871 +0000 UTC m=+244.691241821 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.278977 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.279464 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:53.779426449 +0000 UTC m=+244.792818339 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.279624 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.279943 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:53.779935281 +0000 UTC m=+244.793327171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.380672 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.381137 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:53.881115 +0000 UTC m=+244.894506890 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.483000 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.483595 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:53.983579219 +0000 UTC m=+244.996971109 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.584364 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.585045 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.085018054 +0000 UTC m=+245.098409944 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.627998 4937 ???:1] "http: TLS handshake error from 192.168.126.11:41954: no serving certificate available for the kubelet" Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.685875 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.688529 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.188505898 +0000 UTC m=+245.201897778 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.787467 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.787718 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.287670788 +0000 UTC m=+245.301062668 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.788112 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.788513 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.288497038 +0000 UTC m=+245.301888928 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.888949 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.889215 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.389176414 +0000 UTC m=+245.402568304 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.889576 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.890229 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.390205009 +0000 UTC m=+245.403596899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.991586 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.991862 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.491814868 +0000 UTC m=+245.505206758 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:53 crc kubenswrapper[4937]: I0225 15:49:53.992221 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:53 crc kubenswrapper[4937]: E0225 15:49:53.992689 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.492672449 +0000 UTC m=+245.506064329 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.053355 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:49:54 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:49:54 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:49:54 crc kubenswrapper[4937]: healthz check failed Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.053404 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.093576 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.093771 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.593745895 +0000 UTC m=+245.607137785 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.093838 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.094202 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.594191046 +0000 UTC m=+245.607582936 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.194417 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.194669 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.694627436 +0000 UTC m=+245.708019326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.194805 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.195757 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.695738383 +0000 UTC m=+245.709130463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.297053 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.297279 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.797246749 +0000 UTC m=+245.810638649 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.297399 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.297786 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.797776652 +0000 UTC m=+245.811168542 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.399414 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.399688 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.899659108 +0000 UTC m=+245.913050998 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.400021 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.400450 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:54.900438417 +0000 UTC m=+245.913830507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.501445 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.501688 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.001647816 +0000 UTC m=+246.015039706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.501893 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.502518 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.002474986 +0000 UTC m=+246.015866876 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.561074 4937 generic.go:334] "Generic (PLEG): container finished" podID="d3b2c333-3db5-4de3-bcc1-944dfc35b2b3" containerID="48d01a04fa661c299849ab0a5e5e6fdd42337fce0475e6e6a26fe17d302c978d" exitCode=0 Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.561139 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" event={"ID":"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3","Type":"ContainerDied","Data":"48d01a04fa661c299849ab0a5e5e6fdd42337fce0475e6e6a26fe17d302c978d"} Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.602936 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.603252 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.103203003 +0000 UTC m=+246.116594903 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.603722 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.604048 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.104033433 +0000 UTC m=+246.117425323 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.712105 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.712330 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.212287503 +0000 UTC m=+246.225679393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.712459 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.713210 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.213202725 +0000 UTC m=+246.226594615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.813784 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.813951 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.313927172 +0000 UTC m=+246.327319062 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.814164 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.814522 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.314514816 +0000 UTC m=+246.327906706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.851199 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-54sqd"] Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.852126 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.856333 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.915852 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:54 crc kubenswrapper[4937]: E0225 15:49:54.916309 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.416288349 +0000 UTC m=+246.429680239 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.919568 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-29fxd"] Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.919841 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" podUID="430d304c-8623-4d01-a878-5db061d6a5b8" containerName="controller-manager" containerID="cri-o://d9a00f77a29ed0b768af3ba4dbc05c2b18ada066d4c6e1881b6308ea408c06ca" gracePeriod=30 Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.929383 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.953563 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx"] Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.953895 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" podUID="bf09db34-1df7-44a2-a584-a032476e4d66" containerName="route-controller-manager" containerID="cri-o://8ec9fc667000c543a11830f5cf34e7f3de85b13999fd38ea9a7d8b92f9d035ca" gracePeriod=30 Feb 25 15:49:54 crc kubenswrapper[4937]: I0225 15:49:54.963883 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.018540 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-utilities\") pod \"certified-operators-54sqd\" (UID: \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\") " pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:49:55 
crc kubenswrapper[4937]: I0225 15:49:55.018660 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.018720 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-catalog-content\") pod \"certified-operators-54sqd\" (UID: \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\") " pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.018747 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4vcl\" (UniqueName: \"kubernetes.io/projected/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-kube-api-access-h4vcl\") pod \"certified-operators-54sqd\" (UID: \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\") " pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:49:55 crc kubenswrapper[4937]: E0225 15:49:55.019007 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.518988484 +0000 UTC m=+246.532380364 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.022143 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54sqd"] Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.025851 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l8xkp"] Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.026840 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.032378 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.044837 4937 ???:1] "http: TLS handshake error from 192.168.126.11:41964: no serving certificate available for the kubelet" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.045339 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l8xkp"] Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.052621 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:49:55 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:49:55 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:49:55 crc kubenswrapper[4937]: healthz check failed Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.052702 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.120549 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:55 crc kubenswrapper[4937]: E0225 15:49:55.120577 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.620561012 +0000 UTC m=+246.633952892 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.120996 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-utilities\") pod \"certified-operators-54sqd\" (UID: \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\") " pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.121137 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppp72\" (UniqueName: \"kubernetes.io/projected/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-kube-api-access-ppp72\") pod \"community-operators-l8xkp\" (UID: \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\") " pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.121233 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-utilities\") pod \"community-operators-l8xkp\" (UID: \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\") " pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.121261 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.121360 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-catalog-content\") pod \"community-operators-l8xkp\" (UID: \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\") " pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.121410 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-catalog-content\") pod \"certified-operators-54sqd\" (UID: \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\") " pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.121455 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4vcl\" (UniqueName: \"kubernetes.io/projected/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-kube-api-access-h4vcl\") pod \"certified-operators-54sqd\" (UID: \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\") " pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.122428 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-utilities\") pod \"certified-operators-54sqd\" (UID: \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\") " pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:49:55 crc kubenswrapper[4937]: E0225 15:49:55.122831 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.622816197 +0000 UTC m=+246.636208077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.123107 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-catalog-content\") pod \"certified-operators-54sqd\" (UID: \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\") " pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.133798 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zv69t" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.173837 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4vcl\" (UniqueName: \"kubernetes.io/projected/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-kube-api-access-h4vcl\") pod \"certified-operators-54sqd\" (UID: \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\") " pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.182164 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c4b95"] Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.183717 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.200254 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c4b95"] Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.222685 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.223299 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppp72\" (UniqueName: \"kubernetes.io/projected/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-kube-api-access-ppp72\") pod \"community-operators-l8xkp\" (UID: \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\") " pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.223390 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-utilities\") pod \"community-operators-l8xkp\" (UID: \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\") " pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.223496 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-catalog-content\") pod \"community-operators-l8xkp\" (UID: \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\") " pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.224131 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-catalog-content\") pod \"community-operators-l8xkp\" (UID: \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\") " pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:49:55 crc kubenswrapper[4937]: E0225 15:49:55.224226 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.72420468 +0000 UTC m=+246.737596570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.225971 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-utilities\") pod \"community-operators-l8xkp\" (UID: \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\") " pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.260389 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppp72\" (UniqueName: \"kubernetes.io/projected/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-kube-api-access-ppp72\") pod \"community-operators-l8xkp\" (UID: \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\") " pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.329261 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs9zp\" (UniqueName: \"kubernetes.io/projected/2506466d-db79-4b0e-a2df-2d64c56ad7cd-kube-api-access-zs9zp\") pod \"certified-operators-c4b95\" (UID: \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\") " pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.329362 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.329416 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2506466d-db79-4b0e-a2df-2d64c56ad7cd-utilities\") pod \"certified-operators-c4b95\" (UID: \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\") " pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.329452 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2506466d-db79-4b0e-a2df-2d64c56ad7cd-catalog-content\") pod \"certified-operators-c4b95\" (UID: \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\") " pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:49:55 crc kubenswrapper[4937]: E0225 15:49:55.329763 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.829748274 +0000 UTC m=+246.843140164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.375036 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sbgvv"] Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.376801 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.384556 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sbgvv"] Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.402025 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.432473 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:55 crc kubenswrapper[4937]: E0225 15:49:55.432684 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.932645444 +0000 UTC m=+246.946037334 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.433107 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2506466d-db79-4b0e-a2df-2d64c56ad7cd-utilities\") pod \"certified-operators-c4b95\" (UID: \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\") " pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.433235 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2506466d-db79-4b0e-a2df-2d64c56ad7cd-catalog-content\") pod \"certified-operators-c4b95\" (UID: \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\") " pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.433392 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs9zp\" (UniqueName: \"kubernetes.io/projected/2506466d-db79-4b0e-a2df-2d64c56ad7cd-kube-api-access-zs9zp\") pod \"certified-operators-c4b95\" (UID: \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\") " pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.433552 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.433718 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2506466d-db79-4b0e-a2df-2d64c56ad7cd-utilities\") pod \"certified-operators-c4b95\" (UID: \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\") " pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.433843 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2506466d-db79-4b0e-a2df-2d64c56ad7cd-catalog-content\") pod \"certified-operators-c4b95\" (UID: \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\") " pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:49:55 crc kubenswrapper[4937]: E0225 15:49:55.434219 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:55.934202272 +0000 UTC m=+246.947594152 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.451853 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs9zp\" (UniqueName: \"kubernetes.io/projected/2506466d-db79-4b0e-a2df-2d64c56ad7cd-kube-api-access-zs9zp\") pod \"certified-operators-c4b95\" (UID: \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\") " pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.473923 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.536028 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.536424 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-utilities\") pod \"community-operators-sbgvv\" (UID: \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\") " pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.536586 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8d7b\" (UniqueName: \"kubernetes.io/projected/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-kube-api-access-v8d7b\") pod \"community-operators-sbgvv\" (UID: \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\") " pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.536648 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-catalog-content\") pod \"community-operators-sbgvv\" (UID: \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\") " pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:49:55 crc kubenswrapper[4937]: E0225 15:49:55.536794 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:56.036768834 +0000 UTC m=+247.050160724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.580551 4937 generic.go:334] "Generic (PLEG): container finished" podID="430d304c-8623-4d01-a878-5db061d6a5b8" containerID="d9a00f77a29ed0b768af3ba4dbc05c2b18ada066d4c6e1881b6308ea408c06ca" exitCode=0 Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.580780 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" event={"ID":"430d304c-8623-4d01-a878-5db061d6a5b8","Type":"ContainerDied","Data":"d9a00f77a29ed0b768af3ba4dbc05c2b18ada066d4c6e1881b6308ea408c06ca"} Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.600965 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.606598 4937 generic.go:334] "Generic (PLEG): container finished" podID="bf09db34-1df7-44a2-a584-a032476e4d66" containerID="8ec9fc667000c543a11830f5cf34e7f3de85b13999fd38ea9a7d8b92f9d035ca" exitCode=0 Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.606983 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" event={"ID":"bf09db34-1df7-44a2-a584-a032476e4d66","Type":"ContainerDied","Data":"8ec9fc667000c543a11830f5cf34e7f3de85b13999fd38ea9a7d8b92f9d035ca"} Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.637758 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.637799 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8d7b\" (UniqueName: \"kubernetes.io/projected/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-kube-api-access-v8d7b\") pod \"community-operators-sbgvv\" (UID: \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\") " pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.637848 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-catalog-content\") pod \"community-operators-sbgvv\" (UID: \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\") " pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.637874 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-utilities\") pod \"community-operators-sbgvv\" (UID: \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\") " pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:49:55 crc kubenswrapper[4937]: E0225 15:49:55.638193 4937 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:56.138177148 +0000 UTC m=+247.151569038 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.638680 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-catalog-content\") pod \"community-operators-sbgvv\" (UID: \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\") " pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.639117 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-utilities\") pod \"community-operators-sbgvv\" (UID: \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\") " pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.678123 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8d7b\" (UniqueName: \"kubernetes.io/projected/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-kube-api-access-v8d7b\") pod \"community-operators-sbgvv\" (UID: \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\") " pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.697752 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.726267 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l8xkp"] Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.740298 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:55 crc kubenswrapper[4937]: E0225 15:49:55.741013 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:56.240995916 +0000 UTC m=+247.254387806 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.742909 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.744581 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 15:49:55 crc kubenswrapper[4937]: W0225 15:49:55.745477 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod534065ad_eb70_4ddb_bbf6_b9dbcfc2dc15.slice/crio-9dd9f93e5c1ebefdb1b24a9366cd82c7d13f957fef131406131ae48b58f5402c WatchSource:0}: Error finding container 9dd9f93e5c1ebefdb1b24a9366cd82c7d13f957fef131406131ae48b58f5402c: Status 404 returned error can't find the container with id 9dd9f93e5c1ebefdb1b24a9366cd82c7d13f957fef131406131ae48b58f5402c Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.748354 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.748569 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.754014 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.809961 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54sqd"] Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.831232 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.831270 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.835925 4937 patch_prober.go:28] interesting pod/console-f9d7485db-djs85 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.835996 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-djs85" podUID="ff089f24-3d05-4c97-b6f7-3a39cbec049f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.842222 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3418c5c9-1699-469e-8fbb-78a8bce52af3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3418c5c9-1699-469e-8fbb-78a8bce52af3\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.842275 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3418c5c9-1699-469e-8fbb-78a8bce52af3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3418c5c9-1699-469e-8fbb-78a8bce52af3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.842349 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:55 crc kubenswrapper[4937]: E0225 15:49:55.842707 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:56.342693617 +0000 UTC m=+247.356085507 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.943299 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.943619 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3418c5c9-1699-469e-8fbb-78a8bce52af3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3418c5c9-1699-469e-8fbb-78a8bce52af3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.943646 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3418c5c9-1699-469e-8fbb-78a8bce52af3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3418c5c9-1699-469e-8fbb-78a8bce52af3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 15:49:55 crc kubenswrapper[4937]: E0225 15:49:55.945072 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:56.445022304 +0000 UTC m=+247.458414194 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.945396 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3418c5c9-1699-469e-8fbb-78a8bce52af3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3418c5c9-1699-469e-8fbb-78a8bce52af3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 15:49:55 crc kubenswrapper[4937]: I0225 15:49:55.978850 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3418c5c9-1699-469e-8fbb-78a8bce52af3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3418c5c9-1699-469e-8fbb-78a8bce52af3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.017833 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-znkpp" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.045359 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.045884 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:56.545869614 +0000 UTC m=+247.559261504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.058200 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:49:56 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:49:56 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:49:56 crc kubenswrapper[4937]: healthz check failed Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.058657 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.070361 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.077529 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c4b95"] Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.086462 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 15:49:56 crc kubenswrapper[4937]: W0225 15:49:56.086870 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2506466d_db79_4b0e_a2df_2d64c56ad7cd.slice/crio-302d6d8cd67bdd1998eac072ae450db6cf29604320014c119f91dc4f1636624b WatchSource:0}: Error finding container 302d6d8cd67bdd1998eac072ae450db6cf29604320014c119f91dc4f1636624b: Status 404 returned error can't find the container with id 302d6d8cd67bdd1998eac072ae450db6cf29604320014c119f91dc4f1636624b Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.093691 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.093768 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.094071 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.095538 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.146336 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-secret-volume\") pod \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\" (UID: \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.146398 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gw42\" (UniqueName: \"kubernetes.io/projected/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-kube-api-access-2gw42\") pod \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\" (UID: \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.146610 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.146634 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-config-volume\") pod \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\" (UID: \"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3\") " Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.147096 4937 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:56.647065983 +0000 UTC m=+247.660457873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.147611 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.147868 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-config-volume" (OuterVolumeSpecName: "config-volume") pod "d3b2c333-3db5-4de3-bcc1-944dfc35b2b3" (UID: "d3b2c333-3db5-4de3-bcc1-944dfc35b2b3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.148043 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:56.648030606 +0000 UTC m=+247.661422566 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.153243 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d3b2c333-3db5-4de3-bcc1-944dfc35b2b3" (UID: "d3b2c333-3db5-4de3-bcc1-944dfc35b2b3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.153336 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-kube-api-access-2gw42" (OuterVolumeSpecName: "kube-api-access-2gw42") pod "d3b2c333-3db5-4de3-bcc1-944dfc35b2b3" (UID: "d3b2c333-3db5-4de3-bcc1-944dfc35b2b3"). InnerVolumeSpecName "kube-api-access-2gw42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.249319 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.249651 4937 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.249665 4937 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.249677 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gw42\" (UniqueName: \"kubernetes.io/projected/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3-kube-api-access-2gw42\") on node \"crc\" DevicePath \"\"" Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.249751 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:56.749735027 +0000 UTC m=+247.763126917 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.279110 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.281007 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.352374 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.352807 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:56.852787331 +0000 UTC m=+247.866179221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.355170 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sbgvv"] Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.442954 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.443099 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.454926 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.454988 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/430d304c-8623-4d01-a878-5db061d6a5b8-serving-cert\") pod \"430d304c-8623-4d01-a878-5db061d6a5b8\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.455020 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf09db34-1df7-44a2-a584-a032476e4d66-config\") pod \"bf09db34-1df7-44a2-a584-a032476e4d66\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.455052 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s24f\" (UniqueName: \"kubernetes.io/projected/430d304c-8623-4d01-a878-5db061d6a5b8-kube-api-access-8s24f\") pod \"430d304c-8623-4d01-a878-5db061d6a5b8\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.455130 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:56.955095387 +0000 UTC m=+247.968487277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.455173 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-config\") pod \"430d304c-8623-4d01-a878-5db061d6a5b8\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.455251 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-client-ca\") pod \"430d304c-8623-4d01-a878-5db061d6a5b8\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.455302 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-proxy-ca-bundles\") pod \"430d304c-8623-4d01-a878-5db061d6a5b8\" (UID: \"430d304c-8623-4d01-a878-5db061d6a5b8\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.455335 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf09db34-1df7-44a2-a584-a032476e4d66-serving-cert\") pod \"bf09db34-1df7-44a2-a584-a032476e4d66\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.455383 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf09db34-1df7-44a2-a584-a032476e4d66-client-ca\") pod \"bf09db34-1df7-44a2-a584-a032476e4d66\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.455420 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcfj6\" (UniqueName: \"kubernetes.io/projected/bf09db34-1df7-44a2-a584-a032476e4d66-kube-api-access-kcfj6\") pod \"bf09db34-1df7-44a2-a584-a032476e4d66\" (UID: \"bf09db34-1df7-44a2-a584-a032476e4d66\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.455937 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.456453 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:56.9564403 +0000 UTC m=+247.969832190 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.456579 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf09db34-1df7-44a2-a584-a032476e4d66-config" (OuterVolumeSpecName: "config") pod "bf09db34-1df7-44a2-a584-a032476e4d66" (UID: "bf09db34-1df7-44a2-a584-a032476e4d66"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.462265 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-config" (OuterVolumeSpecName: "config") pod "430d304c-8623-4d01-a878-5db061d6a5b8" (UID: "430d304c-8623-4d01-a878-5db061d6a5b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.462263 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf09db34-1df7-44a2-a584-a032476e4d66-client-ca" (OuterVolumeSpecName: "client-ca") pod "bf09db34-1df7-44a2-a584-a032476e4d66" (UID: "bf09db34-1df7-44a2-a584-a032476e4d66"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.463001 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/430d304c-8623-4d01-a878-5db061d6a5b8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "430d304c-8623-4d01-a878-5db061d6a5b8" (UID: "430d304c-8623-4d01-a878-5db061d6a5b8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.463054 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "430d304c-8623-4d01-a878-5db061d6a5b8" (UID: "430d304c-8623-4d01-a878-5db061d6a5b8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.463103 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-client-ca" (OuterVolumeSpecName: "client-ca") pod "430d304c-8623-4d01-a878-5db061d6a5b8" (UID: "430d304c-8623-4d01-a878-5db061d6a5b8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.464581 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/430d304c-8623-4d01-a878-5db061d6a5b8-kube-api-access-8s24f" (OuterVolumeSpecName: "kube-api-access-8s24f") pod "430d304c-8623-4d01-a878-5db061d6a5b8" (UID: "430d304c-8623-4d01-a878-5db061d6a5b8"). InnerVolumeSpecName "kube-api-access-8s24f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.468281 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf09db34-1df7-44a2-a584-a032476e4d66-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bf09db34-1df7-44a2-a584-a032476e4d66" (UID: "bf09db34-1df7-44a2-a584-a032476e4d66"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.476225 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf09db34-1df7-44a2-a584-a032476e4d66-kube-api-access-kcfj6" (OuterVolumeSpecName: "kube-api-access-kcfj6") pod "bf09db34-1df7-44a2-a584-a032476e4d66" (UID: "bf09db34-1df7-44a2-a584-a032476e4d66"). InnerVolumeSpecName "kube-api-access-kcfj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.485943 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.489915 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.491475 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5jprg" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.558976 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.559474 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf09db34-1df7-44a2-a584-a032476e4d66-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.559514 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcfj6\" (UniqueName: \"kubernetes.io/projected/bf09db34-1df7-44a2-a584-a032476e4d66-kube-api-access-kcfj6\") on node \"crc\" DevicePath \"\"" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.559527 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/430d304c-8623-4d01-a878-5db061d6a5b8-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.559545 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf09db34-1df7-44a2-a584-a032476e4d66-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.559555 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s24f\" (UniqueName: \"kubernetes.io/projected/430d304c-8623-4d01-a878-5db061d6a5b8-kube-api-access-8s24f\") on node \"crc\" DevicePath \"\"" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.559565 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 
15:49:56.559574 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.559582 4937 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/430d304c-8623-4d01-a878-5db061d6a5b8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.559592 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf09db34-1df7-44a2-a584-a032476e4d66-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.561071 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:57.061049261 +0000 UTC m=+248.074441151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.632750 4937 generic.go:334] "Generic (PLEG): container finished" podID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" containerID="254d37e2a8361ca3a196ad356445a1d3c07eb809914a428bc5b0591de538480b" exitCode=0 Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.633174 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4b95" event={"ID":"2506466d-db79-4b0e-a2df-2d64c56ad7cd","Type":"ContainerDied","Data":"254d37e2a8361ca3a196ad356445a1d3c07eb809914a428bc5b0591de538480b"} Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.633255 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4b95" event={"ID":"2506466d-db79-4b0e-a2df-2d64c56ad7cd","Type":"ContainerStarted","Data":"302d6d8cd67bdd1998eac072ae450db6cf29604320014c119f91dc4f1636624b"} Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.650521 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-48pnj" event={"ID":"816999fe-cb2a-4f9b-b546-ca866b5aec3a","Type":"ContainerStarted","Data":"625f85e5421102f31f856eba068b4ff8f1ace3791269ec07df56dc5d5b1303ef"} Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.654074 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbgvv" event={"ID":"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9","Type":"ContainerStarted","Data":"52fea6f79b46519ec210820a602341ea5e1eabe6a24e7ada5087bdcbefa4ad38"} Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.662189 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.663063 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:57.163037659 +0000 UTC m=+248.176429719 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.665734 4937 generic.go:334] "Generic (PLEG): container finished" podID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" containerID="e13bcce78b2fbe9c228202da6952862f6a9736a03c836494aa4cc106e67db74b" exitCode=0 Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.665833 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54sqd" event={"ID":"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c","Type":"ContainerDied","Data":"e13bcce78b2fbe9c228202da6952862f6a9736a03c836494aa4cc106e67db74b"} Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.665871 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54sqd" event={"ID":"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c","Type":"ContainerStarted","Data":"72df2f9c05873a1ef6f206ebb11f72d680317d19b0fc9a3261249d147dccc712"} Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.676638 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.677868 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw" event={"ID":"d3b2c333-3db5-4de3-bcc1-944dfc35b2b3","Type":"ContainerDied","Data":"b861db428d3a67036ef53124ec0ad9b95bf73b7d62362fd3d0fa25a628c2a61a"} Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.677964 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b861db428d3a67036ef53124ec0ad9b95bf73b7d62362fd3d0fa25a628c2a61a" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.680879 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3418c5c9-1699-469e-8fbb-78a8bce52af3","Type":"ContainerStarted","Data":"0de9bb4a60d57e9fcd417cebf955f36be45a750dd13e90727504d8729a4f99fd"} Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.683749 4937 generic.go:334] "Generic (PLEG): container finished" podID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" containerID="0b24f47aca9f8a7cd478f192b8c4f1c809ef2760c03e7a8fc5afa608bee01929" exitCode=0 Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.683804 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8xkp" event={"ID":"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15","Type":"ContainerDied","Data":"0b24f47aca9f8a7cd478f192b8c4f1c809ef2760c03e7a8fc5afa608bee01929"} Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.683822 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8xkp" event={"ID":"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15","Type":"ContainerStarted","Data":"9dd9f93e5c1ebefdb1b24a9366cd82c7d13f957fef131406131ae48b58f5402c"} Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.689528 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" event={"ID":"bf09db34-1df7-44a2-a584-a032476e4d66","Type":"ContainerDied","Data":"9126f0d74fec251ab5643401a4bcc9193542accf0c4b0d5fe602920197cbef3a"} Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.689589 4937 scope.go:117] "RemoveContainer" containerID="8ec9fc667000c543a11830f5cf34e7f3de85b13999fd38ea9a7d8b92f9d035ca" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.689767 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.706516 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.709873 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" event={"ID":"430d304c-8623-4d01-a878-5db061d6a5b8","Type":"ContainerDied","Data":"6863f3ae6615f3ee645173a9a18e865a5da82aec7a20d7156ead0ffb4357fc2e"} Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.736695 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx"] Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.740149 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx"] Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.751701 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-29fxd"] Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.754839 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-29fxd"] Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.763744 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.763890 4937 scope.go:117] "RemoveContainer" containerID="d9a00f77a29ed0b768af3ba4dbc05c2b18ada066d4c6e1881b6308ea408c06ca" Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.763946 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:57.26391693 +0000 UTC m=+248.277308820 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.764243 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.764753 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:57.26473918 +0000 UTC m=+248.278131070 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.866183 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.866247 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.866420 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3b2c333-3db5-4de3-bcc1-944dfc35b2b3" containerName="collect-profiles" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.866430 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3b2c333-3db5-4de3-bcc1-944dfc35b2b3" containerName="collect-profiles" Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.866445 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="430d304c-8623-4d01-a878-5db061d6a5b8" containerName="controller-manager" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.866451 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="430d304c-8623-4d01-a878-5db061d6a5b8" containerName="controller-manager" Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.866460 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf09db34-1df7-44a2-a584-a032476e4d66" containerName="route-controller-manager" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.866467 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf09db34-1df7-44a2-a584-a032476e4d66" containerName="route-controller-manager" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.866571 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf09db34-1df7-44a2-a584-a032476e4d66" containerName="route-controller-manager" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.866591 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="430d304c-8623-4d01-a878-5db061d6a5b8" containerName="controller-manager" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.866600 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3b2c333-3db5-4de3-bcc1-944dfc35b2b3" containerName="collect-profiles" Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.866772 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:57.366759809 +0000 UTC m=+248.380151699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.866889 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.869472 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.869633 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.885619 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.917561 4937 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-29fxd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.917628 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-29fxd" podUID="430d304c-8623-4d01-a878-5db061d6a5b8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.934470 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-crxdn" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.963381 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gsxxs"] Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.964426 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.967260 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.967338 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b14c9c2b-b2a7-413e-b94d-be16cb50eeed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b14c9c2b-b2a7-413e-b94d-be16cb50eeed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.967436 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b14c9c2b-b2a7-413e-b94d-be16cb50eeed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b14c9c2b-b2a7-413e-b94d-be16cb50eeed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 15:49:56 crc kubenswrapper[4937]: E0225 15:49:56.967902 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:57.467887936 +0000 UTC m=+248.481279826 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.968121 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 25 15:49:56 crc kubenswrapper[4937]: I0225 15:49:56.978057 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsxxs"] Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.004945 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fzfv4" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.041041 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr"] Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.041908 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.044442 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.044745 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.045334 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.045706 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.046766 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.047216 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.051968 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.064671 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:49:57 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:49:57 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:49:57 crc kubenswrapper[4937]: healthz check failed Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.064769 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.066587 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.068836 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.069292 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-catalog-content\") pod \"redhat-marketplace-gsxxs\" (UID: \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\") " pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.069620 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b14c9c2b-b2a7-413e-b94d-be16cb50eeed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: 
\"b14c9c2b-b2a7-413e-b94d-be16cb50eeed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.069769 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:57.56970344 +0000 UTC m=+248.583095350 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.070029 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zktg\" (UniqueName: \"kubernetes.io/projected/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-kube-api-access-7zktg\") pod \"redhat-marketplace-gsxxs\" (UID: \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\") " pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.070115 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-utilities\") pod \"redhat-marketplace-gsxxs\" (UID: \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\") " pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.070209 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b14c9c2b-b2a7-413e-b94d-be16cb50eeed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b14c9c2b-b2a7-413e-b94d-be16cb50eeed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.070365 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b14c9c2b-b2a7-413e-b94d-be16cb50eeed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b14c9c2b-b2a7-413e-b94d-be16cb50eeed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.071336 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9"] Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.080190 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.084898 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.085570 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr"] Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.085684 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.086210 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.086942 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.087171 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.087456 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.097398 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9"] Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.101379 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b14c9c2b-b2a7-413e-b94d-be16cb50eeed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b14c9c2b-b2a7-413e-b94d-be16cb50eeed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.171569 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.171626 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/611bc243-950b-47fd-8341-f4a3ae40e27e-config\") pod \"route-controller-manager-7bb6fb86f5-w5qd9\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.171675 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-config\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.171720 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/611bc243-950b-47fd-8341-f4a3ae40e27e-serving-cert\") pod \"route-controller-manager-7bb6fb86f5-w5qd9\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.171833 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-proxy-ca-bundles\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.171925 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:57.671908193 +0000 UTC m=+248.685300083 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.171969 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/611bc243-950b-47fd-8341-f4a3ae40e27e-client-ca\") pod \"route-controller-manager-7bb6fb86f5-w5qd9\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.172098 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zktg\" (UniqueName: \"kubernetes.io/projected/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-kube-api-access-7zktg\") pod \"redhat-marketplace-gsxxs\" (UID: \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\") " pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.172129 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4250b20-1a8c-4c0c-88c0-2598c1e07503-serving-cert\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.172185 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-utilities\") pod \"redhat-marketplace-gsxxs\" (UID: \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\") " pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.172277 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-client-ca\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " 
pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.172341 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq2ck\" (UniqueName: \"kubernetes.io/projected/611bc243-950b-47fd-8341-f4a3ae40e27e-kube-api-access-tq2ck\") pod \"route-controller-manager-7bb6fb86f5-w5qd9\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.172359 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6tzq\" (UniqueName: \"kubernetes.io/projected/f4250b20-1a8c-4c0c-88c0-2598c1e07503-kube-api-access-v6tzq\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.172382 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-catalog-content\") pod \"redhat-marketplace-gsxxs\" (UID: \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\") " pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.173513 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-catalog-content\") pod \"redhat-marketplace-gsxxs\" (UID: \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\") " pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.173584 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-utilities\") pod \"redhat-marketplace-gsxxs\" (UID: \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\") " pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.174134 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hhvn4" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.192396 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zktg\" (UniqueName: \"kubernetes.io/projected/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-kube-api-access-7zktg\") pod \"redhat-marketplace-gsxxs\" (UID: \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\") " pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.202576 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.239343 4937 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-lgfgx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.239412 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgfgx" podUID="bf09db34-1df7-44a2-a584-a032476e4d66" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.274072 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.274304 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4250b20-1a8c-4c0c-88c0-2598c1e07503-serving-cert\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.274404 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:57.774358263 +0000 UTC m=+248.787750153 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.274507 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-client-ca\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.274632 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq2ck\" (UniqueName: \"kubernetes.io/projected/611bc243-950b-47fd-8341-f4a3ae40e27e-kube-api-access-tq2ck\") pod \"route-controller-manager-7bb6fb86f5-w5qd9\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.274661 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6tzq\" (UniqueName: \"kubernetes.io/projected/f4250b20-1a8c-4c0c-88c0-2598c1e07503-kube-api-access-v6tzq\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.275131 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.275193 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/611bc243-950b-47fd-8341-f4a3ae40e27e-config\") pod \"route-controller-manager-7bb6fb86f5-w5qd9\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.275521 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:57.77549826 +0000 UTC m=+248.788890150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.276548 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-config\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.276631 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611bc243-950b-47fd-8341-f4a3ae40e27e-serving-cert\") pod \"route-controller-manager-7bb6fb86f5-w5qd9\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.276657 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-proxy-ca-bundles\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.276700 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/611bc243-950b-47fd-8341-f4a3ae40e27e-client-ca\") pod \"route-controller-manager-7bb6fb86f5-w5qd9\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.276384 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/611bc243-950b-47fd-8341-f4a3ae40e27e-config\") pod \"route-controller-manager-7bb6fb86f5-w5qd9\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.275788 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-client-ca\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.277439 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/611bc243-950b-47fd-8341-f4a3ae40e27e-client-ca\") pod \"route-controller-manager-7bb6fb86f5-w5qd9\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.278269 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-config\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.279173 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4250b20-1a8c-4c0c-88c0-2598c1e07503-serving-cert\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.279746 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-proxy-ca-bundles\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.282872 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611bc243-950b-47fd-8341-f4a3ae40e27e-serving-cert\") pod \"route-controller-manager-7bb6fb86f5-w5qd9\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.295000 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6tzq\" (UniqueName: \"kubernetes.io/projected/f4250b20-1a8c-4c0c-88c0-2598c1e07503-kube-api-access-v6tzq\") pod \"controller-manager-7d84cf4d9-8w4zr\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.298959 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq2ck\" (UniqueName: \"kubernetes.io/projected/611bc243-950b-47fd-8341-f4a3ae40e27e-kube-api-access-tq2ck\") pod \"route-controller-manager-7bb6fb86f5-w5qd9\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.303837 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.321009 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.377943 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.378160 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:57.878123264 +0000 UTC m=+248.891515154 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.378400 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.378766 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:57.878752999 +0000 UTC m=+248.892144889 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.388967 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="430d304c-8623-4d01-a878-5db061d6a5b8" path="/var/lib/kubelet/pods/430d304c-8623-4d01-a878-5db061d6a5b8/volumes" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.389529 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf09db34-1df7-44a2-a584-a032476e4d66" path="/var/lib/kubelet/pods/bf09db34-1df7-44a2-a584-a032476e4d66/volumes" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.390056 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mrgxt"] Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.390932 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrgxt"] Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.391008 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.401093 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.420850 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.441926 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.484139 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.484372 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c753535-03f4-4888-8e28-43b4924726ae-catalog-content\") pod \"redhat-marketplace-mrgxt\" (UID: \"3c753535-03f4-4888-8e28-43b4924726ae\") " pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.484411 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c753535-03f4-4888-8e28-43b4924726ae-utilities\") pod \"redhat-marketplace-mrgxt\" (UID: \"3c753535-03f4-4888-8e28-43b4924726ae\") " pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.484575 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jjdr\" (UniqueName: \"kubernetes.io/projected/3c753535-03f4-4888-8e28-43b4924726ae-kube-api-access-2jjdr\") pod \"redhat-marketplace-mrgxt\" (UID: \"3c753535-03f4-4888-8e28-43b4924726ae\") " pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.485386 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:57.985368719 +0000 UTC m=+248.998760609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.561270 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsxxs"] Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.585560 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jjdr\" (UniqueName: \"kubernetes.io/projected/3c753535-03f4-4888-8e28-43b4924726ae-kube-api-access-2jjdr\") pod \"redhat-marketplace-mrgxt\" (UID: \"3c753535-03f4-4888-8e28-43b4924726ae\") " pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.585639 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c753535-03f4-4888-8e28-43b4924726ae-catalog-content\") pod \"redhat-marketplace-mrgxt\" (UID: \"3c753535-03f4-4888-8e28-43b4924726ae\") " pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.585670 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c753535-03f4-4888-8e28-43b4924726ae-utilities\") pod \"redhat-marketplace-mrgxt\" (UID: \"3c753535-03f4-4888-8e28-43b4924726ae\") " pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.585717 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.586042 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:58.086028785 +0000 UTC m=+249.099420675 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.586227 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c753535-03f4-4888-8e28-43b4924726ae-catalog-content\") pod \"redhat-marketplace-mrgxt\" (UID: \"3c753535-03f4-4888-8e28-43b4924726ae\") " pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.587535 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c753535-03f4-4888-8e28-43b4924726ae-utilities\") pod \"redhat-marketplace-mrgxt\" (UID: \"3c753535-03f4-4888-8e28-43b4924726ae\") " pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.614652 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jjdr\" (UniqueName: \"kubernetes.io/projected/3c753535-03f4-4888-8e28-43b4924726ae-kube-api-access-2jjdr\") pod \"redhat-marketplace-mrgxt\" (UID: \"3c753535-03f4-4888-8e28-43b4924726ae\") " pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.631343 4937 ???:1] "http: TLS handshake error from 192.168.126.11:41970: no serving certificate available for the kubelet" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.687878 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.688073 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:58.188046274 +0000 UTC m=+249.201438164 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.688164 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.688529 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:58.188520145 +0000 UTC m=+249.201912035 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.714357 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.769184 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr"] Feb 25 15:49:57 crc kubenswrapper[4937]: W0225 15:49:57.777270 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4250b20_1a8c_4c0c_88c0_2598c1e07503.slice/crio-8fce08b94c06222ea50e752e2387d356cb7a26ee3a0fd505291babd73149428d WatchSource:0}: Error finding container 8fce08b94c06222ea50e752e2387d356cb7a26ee3a0fd505291babd73149428d: Status 404 returned error can't find the container with id 8fce08b94c06222ea50e752e2387d356cb7a26ee3a0fd505291babd73149428d Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.778186 4937 generic.go:334] "Generic (PLEG): container finished" podID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" containerID="b462f62dccd597bb271b38ef29f98d6f95a14725022e0cdf0eaa0dbf13e72c77" exitCode=0 Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.778248 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbgvv" event={"ID":"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9","Type":"ContainerDied","Data":"b462f62dccd597bb271b38ef29f98d6f95a14725022e0cdf0eaa0dbf13e72c77"} Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.779709 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b14c9c2b-b2a7-413e-b94d-be16cb50eeed","Type":"ContainerStarted","Data":"8c07a3d43325c96daabac8c582a7909f0a16a3b4728b0f6b01f88691066ec7be"} Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.781997 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsxxs" event={"ID":"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b","Type":"ContainerStarted","Data":"f60397d71bc30f5827caa56948c580a994e0fe5420e93840875a795c70c05e1b"} Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.793207 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.794031 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:58.294012398 +0000 UTC m=+249.307404288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.794103 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.795166 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:58.294559871 +0000 UTC m=+249.307951771 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.895225 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.895752 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:58.395734799 +0000 UTC m=+249.409126699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.933049 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9"] Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.945740 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrgxt"] Feb 25 15:49:57 crc kubenswrapper[4937]: W0225 15:49:57.948586 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod611bc243_950b_47fd_8341_f4a3ae40e27e.slice/crio-3e23b3929af50ed0b59e8afb5ccaf24a28719ecd68cc3cb34075bd19c5629409 WatchSource:0}: Error finding container 3e23b3929af50ed0b59e8afb5ccaf24a28719ecd68cc3cb34075bd19c5629409: Status 404 returned error can't find the container with id 3e23b3929af50ed0b59e8afb5ccaf24a28719ecd68cc3cb34075bd19c5629409 Feb 25 15:49:57 crc kubenswrapper[4937]: W0225 15:49:57.956633 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c753535_03f4_4888_8e28_43b4924726ae.slice/crio-cd824703907873b1cc669f0f863cfefa256aecd170d5d4b8daaed22f9a84a8d2 WatchSource:0}: Error finding container cd824703907873b1cc669f0f863cfefa256aecd170d5d4b8daaed22f9a84a8d2: Status 404 returned error can't find the container with id cd824703907873b1cc669f0f863cfefa256aecd170d5d4b8daaed22f9a84a8d2 Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.963861 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-scndr"] Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.965177 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.968980 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.972176 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-scndr"] Feb 25 15:49:57 crc kubenswrapper[4937]: I0225 15:49:57.997749 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:57 crc kubenswrapper[4937]: E0225 15:49:57.998159 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:58.498137028 +0000 UTC m=+249.511529108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.052585 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:49:58 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:49:58 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:49:58 crc kubenswrapper[4937]: healthz check failed Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.052688 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.099308 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:58 crc kubenswrapper[4937]: E0225 15:49:58.099514 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:58.599455549 +0000 UTC m=+249.612847439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.100640 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgc6d\" (UniqueName: \"kubernetes.io/projected/dc970acf-3cdb-4951-8f35-705ce003550f-kube-api-access-sgc6d\") pod \"redhat-operators-scndr\" (UID: \"dc970acf-3cdb-4951-8f35-705ce003550f\") " pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.100834 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc970acf-3cdb-4951-8f35-705ce003550f-catalog-content\") pod \"redhat-operators-scndr\" (UID: \"dc970acf-3cdb-4951-8f35-705ce003550f\") " pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.100912 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.101037 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc970acf-3cdb-4951-8f35-705ce003550f-utilities\") pod \"redhat-operators-scndr\" (UID: \"dc970acf-3cdb-4951-8f35-705ce003550f\") " pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:49:58 crc kubenswrapper[4937]: E0225 15:49:58.101688 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:58.601665623 +0000 UTC m=+249.615057503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.158568 4937 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.202167 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:58 crc kubenswrapper[4937]: E0225 15:49:58.202353 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:58.702329709 +0000 UTC m=+249.715721609 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.202764 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc970acf-3cdb-4951-8f35-705ce003550f-utilities\") pod \"redhat-operators-scndr\" (UID: \"dc970acf-3cdb-4951-8f35-705ce003550f\") " pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.202886 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgc6d\" (UniqueName: \"kubernetes.io/projected/dc970acf-3cdb-4951-8f35-705ce003550f-kube-api-access-sgc6d\") pod \"redhat-operators-scndr\" (UID: \"dc970acf-3cdb-4951-8f35-705ce003550f\") " pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.202985 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc970acf-3cdb-4951-8f35-705ce003550f-catalog-content\") pod \"redhat-operators-scndr\" (UID: \"dc970acf-3cdb-4951-8f35-705ce003550f\") " pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.203064 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:58 crc kubenswrapper[4937]: E0225 15:49:58.203335 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:58.703327333 +0000 UTC m=+249.716719223 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.203648 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc970acf-3cdb-4951-8f35-705ce003550f-catalog-content\") pod \"redhat-operators-scndr\" (UID: \"dc970acf-3cdb-4951-8f35-705ce003550f\") " pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.203945 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc970acf-3cdb-4951-8f35-705ce003550f-utilities\") pod \"redhat-operators-scndr\" (UID: \"dc970acf-3cdb-4951-8f35-705ce003550f\") " pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.251706 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgc6d\" (UniqueName: \"kubernetes.io/projected/dc970acf-3cdb-4951-8f35-705ce003550f-kube-api-access-sgc6d\") pod \"redhat-operators-scndr\" (UID: \"dc970acf-3cdb-4951-8f35-705ce003550f\") " pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.290450 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.303988 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:58 crc kubenswrapper[4937]: E0225 15:49:58.304380 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:58.804366018 +0000 UTC m=+249.817757898 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.365371 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l9tlm"] Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.366459 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.387242 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l9tlm"] Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.418661 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:58 crc kubenswrapper[4937]: E0225 15:49:58.418965 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:58.91895128 +0000 UTC m=+249.932343170 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.520215 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.520534 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fae7d336-b701-4174-bdae-bd3f1bc032b1-utilities\") pod \"redhat-operators-l9tlm\" (UID: \"fae7d336-b701-4174-bdae-bd3f1bc032b1\") " pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.520602 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4wwh\" (UniqueName: \"kubernetes.io/projected/fae7d336-b701-4174-bdae-bd3f1bc032b1-kube-api-access-b4wwh\") pod \"redhat-operators-l9tlm\" (UID: \"fae7d336-b701-4174-bdae-bd3f1bc032b1\") " pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.520696 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fae7d336-b701-4174-bdae-bd3f1bc032b1-catalog-content\") pod \"redhat-operators-l9tlm\" (UID: \"fae7d336-b701-4174-bdae-bd3f1bc032b1\") " pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:49:58 crc kubenswrapper[4937]: E0225 15:49:58.521669 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:59.021642485 +0000 UTC m=+250.035034375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.609407 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-scndr"] Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.622042 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fae7d336-b701-4174-bdae-bd3f1bc032b1-catalog-content\") pod \"redhat-operators-l9tlm\" (UID: \"fae7d336-b701-4174-bdae-bd3f1bc032b1\") " pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.622165 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fae7d336-b701-4174-bdae-bd3f1bc032b1-utilities\") pod \"redhat-operators-l9tlm\" (UID: \"fae7d336-b701-4174-bdae-bd3f1bc032b1\") " pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.622204 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4wwh\" (UniqueName: \"kubernetes.io/projected/fae7d336-b701-4174-bdae-bd3f1bc032b1-kube-api-access-b4wwh\") pod \"redhat-operators-l9tlm\" (UID: \"fae7d336-b701-4174-bdae-bd3f1bc032b1\") " pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.622244 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:58 crc kubenswrapper[4937]: E0225 15:49:58.622501 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 15:49:59.122476085 +0000 UTC m=+250.135867965 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pdnqk" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.622544 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fae7d336-b701-4174-bdae-bd3f1bc032b1-catalog-content\") pod \"redhat-operators-l9tlm\" (UID: \"fae7d336-b701-4174-bdae-bd3f1bc032b1\") " pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.622902 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fae7d336-b701-4174-bdae-bd3f1bc032b1-utilities\") pod \"redhat-operators-l9tlm\" (UID: \"fae7d336-b701-4174-bdae-bd3f1bc032b1\") " pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.647520 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4wwh\" (UniqueName: \"kubernetes.io/projected/fae7d336-b701-4174-bdae-bd3f1bc032b1-kube-api-access-b4wwh\") pod \"redhat-operators-l9tlm\" (UID: \"fae7d336-b701-4174-bdae-bd3f1bc032b1\") " pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.675329 4937 patch_prober.go:28] interesting pod/apiserver-76f77b778f-7drd4 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 25 15:49:58 crc kubenswrapper[4937]: [+]log ok Feb 25 15:49:58 crc kubenswrapper[4937]: [+]etcd ok Feb 25 15:49:58 crc kubenswrapper[4937]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 25 15:49:58 crc kubenswrapper[4937]: [+]poststarthook/generic-apiserver-start-informers ok Feb 25 15:49:58 crc kubenswrapper[4937]: [+]poststarthook/max-in-flight-filter ok Feb 25 15:49:58 crc kubenswrapper[4937]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 25 15:49:58 crc kubenswrapper[4937]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 25 15:49:58 crc kubenswrapper[4937]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 25 15:49:58 crc kubenswrapper[4937]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 25 15:49:58 crc kubenswrapper[4937]: [+]poststarthook/project.openshift.io-projectcache ok Feb 25 15:49:58 crc kubenswrapper[4937]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 25 15:49:58 crc kubenswrapper[4937]: [+]poststarthook/openshift.io-startinformers ok Feb 25 15:49:58 crc kubenswrapper[4937]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 25 15:49:58 crc kubenswrapper[4937]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 25 15:49:58 crc kubenswrapper[4937]: livez check failed Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.675446 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-7drd4" podUID="0f9fc900-6cf7-4890-8e7d-6925e9e3862f" containerName="openshift-apiserver" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.723594 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:58 crc kubenswrapper[4937]: E0225 15:49:58.724181 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 15:49:59.224162256 +0000 UTC m=+250.237554146 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.731673 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.743647 4937 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-25T15:49:58.158782075Z","Handler":null,"Name":""} Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.750247 4937 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.750289 4937 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.797411 4937 generic.go:334] "Generic (PLEG): container finished" podID="3418c5c9-1699-469e-8fbb-78a8bce52af3" containerID="fc877140c6984b4c809dc83014dc6da84f97078edd193bfb90d334e3626ee015" exitCode=0 Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.797869 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3418c5c9-1699-469e-8fbb-78a8bce52af3","Type":"ContainerDied","Data":"fc877140c6984b4c809dc83014dc6da84f97078edd193bfb90d334e3626ee015"} Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.820129 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-48pnj" event={"ID":"816999fe-cb2a-4f9b-b546-ca866b5aec3a","Type":"ContainerStarted","Data":"bcc2ca41f232429cfb46703095b3301ebe95eb4d7d277d02007ebc61e084b72e"} Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.825279 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: 
\"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.826083 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b14c9c2b-b2a7-413e-b94d-be16cb50eeed","Type":"ContainerStarted","Data":"4eebd7e9024529f54ad60ddf65edd82aec08a183e6ad3c5c9338d35eba96c182"} Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.841898 4937 generic.go:334] "Generic (PLEG): container finished" podID="3c753535-03f4-4888-8e28-43b4924726ae" containerID="50afffbc2a0c0f468eb8f02c16a9bfebf9b9c4158782ac959bac67a586586f9c" exitCode=0 Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.841968 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrgxt" event={"ID":"3c753535-03f4-4888-8e28-43b4924726ae","Type":"ContainerDied","Data":"50afffbc2a0c0f468eb8f02c16a9bfebf9b9c4158782ac959bac67a586586f9c"} Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.841994 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrgxt" event={"ID":"3c753535-03f4-4888-8e28-43b4924726ae","Type":"ContainerStarted","Data":"cd824703907873b1cc669f0f863cfefa256aecd170d5d4b8daaed22f9a84a8d2"} Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.847375 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" event={"ID":"f4250b20-1a8c-4c0c-88c0-2598c1e07503","Type":"ContainerStarted","Data":"3eb1fefb0cabbd7a0e8143606ae4740aee2fcecba90cf22e30a19282ad17b6dd"} Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.847437 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" event={"ID":"f4250b20-1a8c-4c0c-88c0-2598c1e07503","Type":"ContainerStarted","Data":"8fce08b94c06222ea50e752e2387d356cb7a26ee3a0fd505291babd73149428d"} Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.848594 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.850785 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.85075925 podStartE2EDuration="2.85075925s" podCreationTimestamp="2026-02-25 15:49:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:58.848597487 +0000 UTC m=+249.861989377" watchObservedRunningTime="2026-02-25 15:49:58.85075925 +0000 UTC m=+249.864151140" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.866239 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" event={"ID":"611bc243-950b-47fd-8341-f4a3ae40e27e","Type":"ContainerStarted","Data":"c78611f4b28b41f99492fc1502eb7e5f76b268b0ecf133a509992888b71b01bc"} Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.866295 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" event={"ID":"611bc243-950b-47fd-8341-f4a3ae40e27e","Type":"ContainerStarted","Data":"3e23b3929af50ed0b59e8afb5ccaf24a28719ecd68cc3cb34075bd19c5629409"} Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.867075 
4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.880011 4937 generic.go:334] "Generic (PLEG): container finished" podID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" containerID="87fd06301df005ac0266157675a57d8f32c754d569794513d573076fc4940d4c" exitCode=0 Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.880169 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsxxs" event={"ID":"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b","Type":"ContainerDied","Data":"87fd06301df005ac0266157675a57d8f32c754d569794513d573076fc4940d4c"} Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.880238 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.889700 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.902011 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" podStartSLOduration=3.901995869 podStartE2EDuration="3.901995869s" podCreationTimestamp="2026-02-25 15:49:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:58.901173689 +0000 UTC m=+249.914565589" watchObservedRunningTime="2026-02-25 15:49:58.901995869 +0000 UTC m=+249.915387759" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.917766 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.917833 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:58 crc kubenswrapper[4937]: I0225 15:49:58.984463 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" podStartSLOduration=3.9844423239999998 podStartE2EDuration="3.984442324s" podCreationTimestamp="2026-02-25 15:49:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:58.926518143 +0000 UTC m=+249.939910033" watchObservedRunningTime="2026-02-25 15:49:58.984442324 +0000 UTC m=+249.997834214" Feb 25 15:49:59 crc kubenswrapper[4937]: I0225 15:49:59.059755 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:49:59 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:49:59 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:49:59 crc kubenswrapper[4937]: healthz check failed Feb 25 15:49:59 crc kubenswrapper[4937]: I0225 15:49:59.059847 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:49:59 crc kubenswrapper[4937]: I0225 15:49:59.087023 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pdnqk\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:59 crc kubenswrapper[4937]: I0225 15:49:59.130210 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 15:49:59 crc kubenswrapper[4937]: I0225 15:49:59.137032 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 25 15:49:59 crc kubenswrapper[4937]: I0225 15:49:59.333537 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 25 15:49:59 crc kubenswrapper[4937]: I0225 15:49:59.342805 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:49:59 crc kubenswrapper[4937]: I0225 15:49:59.377517 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 25 15:49:59 crc kubenswrapper[4937]: I0225 15:49:59.418404 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9nd26" Feb 25 15:49:59 crc kubenswrapper[4937]: I0225 15:49:59.887054 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-48pnj" event={"ID":"816999fe-cb2a-4f9b-b546-ca866b5aec3a","Type":"ContainerStarted","Data":"ef7dd2cd216f645aa76158da65d55f4b47ef6df11e23a412dce49817e064621e"} Feb 25 15:49:59 crc kubenswrapper[4937]: I0225 15:49:59.888348 4937 generic.go:334] "Generic (PLEG): container finished" podID="b14c9c2b-b2a7-413e-b94d-be16cb50eeed" containerID="4eebd7e9024529f54ad60ddf65edd82aec08a183e6ad3c5c9338d35eba96c182" exitCode=0 Feb 25 15:49:59 crc kubenswrapper[4937]: I0225 15:49:59.888449 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b14c9c2b-b2a7-413e-b94d-be16cb50eeed","Type":"ContainerDied","Data":"4eebd7e9024529f54ad60ddf65edd82aec08a183e6ad3c5c9338d35eba96c182"} Feb 25 15:49:59 crc kubenswrapper[4937]: I0225 15:49:59.908879 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-48pnj" podStartSLOduration=16.908857305 podStartE2EDuration="16.908857305s" podCreationTimestamp="2026-02-25 15:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:49:59.906430456 +0000 UTC m=+250.919822346" watchObservedRunningTime="2026-02-25 15:49:59.908857305 +0000 UTC m=+250.922249205" Feb 25 15:50:00 crc kubenswrapper[4937]: I0225 15:50:00.057796 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:50:00 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:50:00 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:50:00 crc kubenswrapper[4937]: healthz check failed Feb 25 15:50:00 crc kubenswrapper[4937]: I0225 15:50:00.057859 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:50:00 crc kubenswrapper[4937]: I0225 15:50:00.123818 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533910-n4bgf"] Feb 25 15:50:00 crc kubenswrapper[4937]: I0225 15:50:00.124472 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533910-n4bgf" Feb 25 15:50:00 crc kubenswrapper[4937]: I0225 15:50:00.126180 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 15:50:00 crc kubenswrapper[4937]: I0225 15:50:00.137994 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533910-n4bgf"] Feb 25 15:50:00 crc kubenswrapper[4937]: I0225 15:50:00.246294 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krm9b\" (UniqueName: \"kubernetes.io/projected/6e6e93a3-6673-464f-84a3-5585a6cbc0a8-kube-api-access-krm9b\") pod \"auto-csr-approver-29533910-n4bgf\" (UID: \"6e6e93a3-6673-464f-84a3-5585a6cbc0a8\") " pod="openshift-infra/auto-csr-approver-29533910-n4bgf" Feb 25 15:50:00 crc kubenswrapper[4937]: I0225 15:50:00.348048 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krm9b\" (UniqueName: \"kubernetes.io/projected/6e6e93a3-6673-464f-84a3-5585a6cbc0a8-kube-api-access-krm9b\") pod \"auto-csr-approver-29533910-n4bgf\" (UID: \"6e6e93a3-6673-464f-84a3-5585a6cbc0a8\") " pod="openshift-infra/auto-csr-approver-29533910-n4bgf" Feb 25 15:50:00 crc kubenswrapper[4937]: I0225 15:50:00.369544 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krm9b\" (UniqueName: \"kubernetes.io/projected/6e6e93a3-6673-464f-84a3-5585a6cbc0a8-kube-api-access-krm9b\") pod \"auto-csr-approver-29533910-n4bgf\" (UID: \"6e6e93a3-6673-464f-84a3-5585a6cbc0a8\") " pod="openshift-infra/auto-csr-approver-29533910-n4bgf" Feb 25 15:50:00 crc kubenswrapper[4937]: I0225 15:50:00.448598 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533910-n4bgf" Feb 25 15:50:01 crc kubenswrapper[4937]: I0225 15:50:01.050620 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:50:01 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:50:01 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:50:01 crc kubenswrapper[4937]: healthz check failed Feb 25 15:50:01 crc kubenswrapper[4937]: I0225 15:50:01.050685 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:50:01 crc kubenswrapper[4937]: I0225 15:50:01.446397 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:50:01 crc kubenswrapper[4937]: I0225 15:50:01.451089 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-7drd4" Feb 25 15:50:01 crc kubenswrapper[4937]: I0225 15:50:01.596785 4937 ???:1] "http: TLS handshake error from 192.168.126.11:58720: no serving certificate available for the kubelet" Feb 25 15:50:02 crc kubenswrapper[4937]: I0225 15:50:02.053030 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:50:02 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:50:02 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:50:02 crc kubenswrapper[4937]: healthz check failed Feb 25 15:50:02 crc kubenswrapper[4937]: I0225 15:50:02.053329 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:50:02 crc kubenswrapper[4937]: I0225 15:50:02.489552 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:50:02 crc kubenswrapper[4937]: I0225 15:50:02.491773 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 25 15:50:02 crc kubenswrapper[4937]: I0225 15:50:02.517293 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f125006f-2b26-4ffe-ac0d-dc756f48b067-metrics-certs\") pod \"network-metrics-daemon-sz7zh\" (UID: \"f125006f-2b26-4ffe-ac0d-dc756f48b067\") " pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:50:02 crc kubenswrapper[4937]: I0225 15:50:02.748110 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 25 15:50:02 crc kubenswrapper[4937]: I0225 15:50:02.756458 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sz7zh" Feb 25 15:50:02 crc kubenswrapper[4937]: I0225 15:50:02.772579 4937 ???:1] "http: TLS handshake error from 192.168.126.11:58728: no serving certificate available for the kubelet" Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.050775 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:50:03 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:50:03 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:50:03 crc kubenswrapper[4937]: healthz check failed Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.050835 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:50:03 crc kubenswrapper[4937]: W0225 15:50:03.778068 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc970acf_3cdb_4951_8f35_705ce003550f.slice/crio-eedbc6a8629695a67983066428f03439464cbbb8a703e13a57024be745db089a WatchSource:0}: Error finding container eedbc6a8629695a67983066428f03439464cbbb8a703e13a57024be745db089a: Status 404 returned error can't find the container with id eedbc6a8629695a67983066428f03439464cbbb8a703e13a57024be745db089a Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.819756 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.826617 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.911182 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3418c5c9-1699-469e-8fbb-78a8bce52af3-kubelet-dir\") pod \"3418c5c9-1699-469e-8fbb-78a8bce52af3\" (UID: \"3418c5c9-1699-469e-8fbb-78a8bce52af3\") " Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.911263 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3418c5c9-1699-469e-8fbb-78a8bce52af3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3418c5c9-1699-469e-8fbb-78a8bce52af3" (UID: "3418c5c9-1699-469e-8fbb-78a8bce52af3"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.911344 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3418c5c9-1699-469e-8fbb-78a8bce52af3-kube-api-access\") pod \"3418c5c9-1699-469e-8fbb-78a8bce52af3\" (UID: \"3418c5c9-1699-469e-8fbb-78a8bce52af3\") " Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.911387 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b14c9c2b-b2a7-413e-b94d-be16cb50eeed-kubelet-dir\") pod \"b14c9c2b-b2a7-413e-b94d-be16cb50eeed\" (UID: \"b14c9c2b-b2a7-413e-b94d-be16cb50eeed\") " Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.911412 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b14c9c2b-b2a7-413e-b94d-be16cb50eeed-kube-api-access\") pod \"b14c9c2b-b2a7-413e-b94d-be16cb50eeed\" (UID: \"b14c9c2b-b2a7-413e-b94d-be16cb50eeed\") " Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.911505 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b14c9c2b-b2a7-413e-b94d-be16cb50eeed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b14c9c2b-b2a7-413e-b94d-be16cb50eeed" (UID: "b14c9c2b-b2a7-413e-b94d-be16cb50eeed"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.912215 4937 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3418c5c9-1699-469e-8fbb-78a8bce52af3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.912241 4937 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b14c9c2b-b2a7-413e-b94d-be16cb50eeed-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.915713 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scndr" event={"ID":"dc970acf-3cdb-4951-8f35-705ce003550f","Type":"ContainerStarted","Data":"eedbc6a8629695a67983066428f03439464cbbb8a703e13a57024be745db089a"} Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.916445 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3418c5c9-1699-469e-8fbb-78a8bce52af3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3418c5c9-1699-469e-8fbb-78a8bce52af3" (UID: "3418c5c9-1699-469e-8fbb-78a8bce52af3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.916582 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b14c9c2b-b2a7-413e-b94d-be16cb50eeed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b14c9c2b-b2a7-413e-b94d-be16cb50eeed" (UID: "b14c9c2b-b2a7-413e-b94d-be16cb50eeed"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.917185 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.925398 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3418c5c9-1699-469e-8fbb-78a8bce52af3","Type":"ContainerDied","Data":"0de9bb4a60d57e9fcd417cebf955f36be45a750dd13e90727504d8729a4f99fd"} Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.925435 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0de9bb4a60d57e9fcd417cebf955f36be45a750dd13e90727504d8729a4f99fd" Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.928010 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b14c9c2b-b2a7-413e-b94d-be16cb50eeed","Type":"ContainerDied","Data":"8c07a3d43325c96daabac8c582a7909f0a16a3b4728b0f6b01f88691066ec7be"} Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.928042 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c07a3d43325c96daabac8c582a7909f0a16a3b4728b0f6b01f88691066ec7be" Feb 25 15:50:03 crc kubenswrapper[4937]: I0225 15:50:03.928047 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 15:50:04 crc kubenswrapper[4937]: I0225 15:50:04.014022 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3418c5c9-1699-469e-8fbb-78a8bce52af3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 15:50:04 crc kubenswrapper[4937]: I0225 15:50:04.014057 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b14c9c2b-b2a7-413e-b94d-be16cb50eeed-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 15:50:04 crc kubenswrapper[4937]: I0225 15:50:04.050912 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:50:04 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:50:04 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:50:04 crc kubenswrapper[4937]: healthz check failed Feb 25 15:50:04 crc kubenswrapper[4937]: I0225 15:50:04.051017 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:50:05 crc kubenswrapper[4937]: I0225 15:50:05.050228 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:50:05 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:50:05 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:50:05 crc kubenswrapper[4937]: healthz check failed Feb 25 15:50:05 crc kubenswrapper[4937]: I0225 15:50:05.050521 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Feb 25 15:50:05 crc kubenswrapper[4937]: I0225 15:50:05.832630 4937 patch_prober.go:28] interesting pod/console-f9d7485db-djs85 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 25 15:50:05 crc kubenswrapper[4937]: I0225 15:50:05.833159 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-djs85" podUID="ff089f24-3d05-4c97-b6f7-3a39cbec049f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 25 15:50:05 crc kubenswrapper[4937]: I0225 15:50:05.965963 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l9tlm"] Feb 25 15:50:05 crc kubenswrapper[4937]: W0225 15:50:05.976524 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfae7d336_b701_4174_bdae_bd3f1bc032b1.slice/crio-038aefc5b55abeda87c84981b6e0dcc15d183b49a8313d7edb8541461b4cafbe WatchSource:0}: Error finding container 038aefc5b55abeda87c84981b6e0dcc15d183b49a8313d7edb8541461b4cafbe: Status 404 returned error can't find the container with id 038aefc5b55abeda87c84981b6e0dcc15d183b49a8313d7edb8541461b4cafbe Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.052667 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:50:06 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:50:06 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:50:06 crc kubenswrapper[4937]: healthz check failed Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.052757 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.093048 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.093133 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.093299 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.093317 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.183408 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533910-n4bgf"] Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.195923 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pdnqk"] Feb 25 15:50:06 crc kubenswrapper[4937]: W0225 15:50:06.196227 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e6e93a3_6673_464f_84a3_5585a6cbc0a8.slice/crio-4820e8c1cb17bdca4851c2b966ae4968e408f258e779cddaac92c9744df71df4 WatchSource:0}: Error finding container 4820e8c1cb17bdca4851c2b966ae4968e408f258e779cddaac92c9744df71df4: Status 404 returned error can't find the container with id 4820e8c1cb17bdca4851c2b966ae4968e408f258e779cddaac92c9744df71df4 Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.199377 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sz7zh"] Feb 25 15:50:06 crc kubenswrapper[4937]: W0225 15:50:06.200404 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf125006f_2b26_4ffe_ac0d_dc756f48b067.slice/crio-f5415e09a080c5c4a62467d47efa2faf01992bd71fc908771c069a4b024f003c WatchSource:0}: Error finding container f5415e09a080c5c4a62467d47efa2faf01992bd71fc908771c069a4b024f003c: Status 404 returned error can't find the container with id f5415e09a080c5c4a62467d47efa2faf01992bd71fc908771c069a4b024f003c Feb 25 15:50:06 crc kubenswrapper[4937]: W0225 15:50:06.205832 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod969c8ed5_2ea2_4966_9fd6_954ab7ef8ccf.slice/crio-00edcc4b3894954d32ede2a937efe6772921ea5d60c8fc1214cf6b53e06c50d5 WatchSource:0}: Error finding container 00edcc4b3894954d32ede2a937efe6772921ea5d60c8fc1214cf6b53e06c50d5: Status 404 returned error can't find the container with id 00edcc4b3894954d32ede2a937efe6772921ea5d60c8fc1214cf6b53e06c50d5 Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.945874 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533910-n4bgf" event={"ID":"6e6e93a3-6673-464f-84a3-5585a6cbc0a8","Type":"ContainerStarted","Data":"4820e8c1cb17bdca4851c2b966ae4968e408f258e779cddaac92c9744df71df4"} Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.947594 4937 generic.go:334] "Generic (PLEG): container finished" podID="fae7d336-b701-4174-bdae-bd3f1bc032b1" containerID="e29ca1b0226e4224c03df32df7ca5063937ef4b2b1d7b6b1b4f6a281aa2aaafe" exitCode=0 Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.947658 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9tlm" event={"ID":"fae7d336-b701-4174-bdae-bd3f1bc032b1","Type":"ContainerDied","Data":"e29ca1b0226e4224c03df32df7ca5063937ef4b2b1d7b6b1b4f6a281aa2aaafe"} Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.947688 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9tlm" event={"ID":"fae7d336-b701-4174-bdae-bd3f1bc032b1","Type":"ContainerStarted","Data":"038aefc5b55abeda87c84981b6e0dcc15d183b49a8313d7edb8541461b4cafbe"} Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.950157 4937 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" event={"ID":"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf","Type":"ContainerStarted","Data":"72c109c539d2049284c53056ca9ffccf71c7af0d1ade28d88ac3eda8db25ba81"} Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.950185 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" event={"ID":"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf","Type":"ContainerStarted","Data":"00edcc4b3894954d32ede2a937efe6772921ea5d60c8fc1214cf6b53e06c50d5"} Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.952290 4937 generic.go:334] "Generic (PLEG): container finished" podID="dc970acf-3cdb-4951-8f35-705ce003550f" containerID="674a86c36c74adcc12046294564d7e1dc8bbf3dfe538e769d6cea8b801395692" exitCode=0 Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.952358 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scndr" event={"ID":"dc970acf-3cdb-4951-8f35-705ce003550f","Type":"ContainerDied","Data":"674a86c36c74adcc12046294564d7e1dc8bbf3dfe538e769d6cea8b801395692"} Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.954311 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" event={"ID":"f125006f-2b26-4ffe-ac0d-dc756f48b067","Type":"ContainerStarted","Data":"881bb5f3425d11da1b02774e2ce415439b0407121751fe8a7f7af627501a3d0f"} Feb 25 15:50:06 crc kubenswrapper[4937]: I0225 15:50:06.954402 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" event={"ID":"f125006f-2b26-4ffe-ac0d-dc756f48b067","Type":"ContainerStarted","Data":"f5415e09a080c5c4a62467d47efa2faf01992bd71fc908771c069a4b024f003c"} Feb 25 15:50:07 crc kubenswrapper[4937]: I0225 15:50:07.050398 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:50:07 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:50:07 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:50:07 crc kubenswrapper[4937]: healthz check failed Feb 25 15:50:07 crc kubenswrapper[4937]: I0225 15:50:07.050727 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:50:07 crc kubenswrapper[4937]: I0225 15:50:07.961123 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sz7zh" event={"ID":"f125006f-2b26-4ffe-ac0d-dc756f48b067","Type":"ContainerStarted","Data":"8a34d9c05118e1ad3081305f9636e5c167d94c524e54a2bfc0db1c5860c02698"} Feb 25 15:50:07 crc kubenswrapper[4937]: I0225 15:50:07.961247 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:50:07 crc kubenswrapper[4937]: I0225 15:50:07.979300 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" podStartSLOduration=206.979275722 podStartE2EDuration="3m26.979275722s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-25 15:50:07.97795431 +0000 UTC m=+258.991346210" watchObservedRunningTime="2026-02-25 15:50:07.979275722 +0000 UTC m=+258.992667622" Feb 25 15:50:08 crc kubenswrapper[4937]: I0225 15:50:08.052173 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:50:08 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:50:08 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:50:08 crc kubenswrapper[4937]: healthz check failed Feb 25 15:50:08 crc kubenswrapper[4937]: I0225 15:50:08.052459 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:50:09 crc kubenswrapper[4937]: I0225 15:50:09.052075 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:50:09 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:50:09 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:50:09 crc kubenswrapper[4937]: healthz check failed Feb 25 15:50:09 crc kubenswrapper[4937]: I0225 15:50:09.052187 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:50:10 crc kubenswrapper[4937]: I0225 15:50:10.050121 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:50:10 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:50:10 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:50:10 crc kubenswrapper[4937]: healthz check failed Feb 25 15:50:10 crc kubenswrapper[4937]: I0225 15:50:10.050176 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:50:11 crc kubenswrapper[4937]: I0225 15:50:11.051203 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:50:11 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:50:11 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:50:11 crc kubenswrapper[4937]: healthz check failed Feb 25 15:50:11 crc kubenswrapper[4937]: I0225 15:50:11.051745 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:50:11 crc kubenswrapper[4937]: I0225 15:50:11.390416 4937 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sz7zh" podStartSLOduration=210.390379241 podStartE2EDuration="3m30.390379241s" podCreationTimestamp="2026-02-25 15:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:50:09.993590085 +0000 UTC m=+261.006981985" watchObservedRunningTime="2026-02-25 15:50:11.390379241 +0000 UTC m=+262.403771141" Feb 25 15:50:11 crc kubenswrapper[4937]: I0225 15:50:11.494692 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 15:50:11 crc kubenswrapper[4937]: I0225 15:50:11.494759 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 15:50:12 crc kubenswrapper[4937]: I0225 15:50:12.051153 4937 patch_prober.go:28] interesting pod/router-default-5444994796-q2wd7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 15:50:12 crc kubenswrapper[4937]: [-]has-synced failed: reason withheld Feb 25 15:50:12 crc kubenswrapper[4937]: [+]process-running ok Feb 25 15:50:12 crc kubenswrapper[4937]: healthz check failed Feb 25 15:50:12 crc kubenswrapper[4937]: I0225 15:50:12.051251 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-q2wd7" podUID="e21358c1-fad3-42c2-982d-8f3e50fadc34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:50:13 crc kubenswrapper[4937]: I0225 15:50:13.051171 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:50:13 crc kubenswrapper[4937]: I0225 15:50:13.055072 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-q2wd7" Feb 25 15:50:14 crc kubenswrapper[4937]: I0225 15:50:14.362759 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr"] Feb 25 15:50:14 crc kubenswrapper[4937]: I0225 15:50:14.363136 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" podUID="f4250b20-1a8c-4c0c-88c0-2598c1e07503" containerName="controller-manager" containerID="cri-o://3eb1fefb0cabbd7a0e8143606ae4740aee2fcecba90cf22e30a19282ad17b6dd" gracePeriod=30 Feb 25 15:50:14 crc kubenswrapper[4937]: I0225 15:50:14.381436 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9"] Feb 25 15:50:14 crc kubenswrapper[4937]: I0225 15:50:14.381674 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" podUID="611bc243-950b-47fd-8341-f4a3ae40e27e" 
containerName="route-controller-manager" containerID="cri-o://c78611f4b28b41f99492fc1502eb7e5f76b268b0ecf133a509992888b71b01bc" gracePeriod=30 Feb 25 15:50:15 crc kubenswrapper[4937]: I0225 15:50:15.839219 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:50:15 crc kubenswrapper[4937]: I0225 15:50:15.844442 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-djs85" Feb 25 15:50:16 crc kubenswrapper[4937]: I0225 15:50:16.007419 4937 generic.go:334] "Generic (PLEG): container finished" podID="f4250b20-1a8c-4c0c-88c0-2598c1e07503" containerID="3eb1fefb0cabbd7a0e8143606ae4740aee2fcecba90cf22e30a19282ad17b6dd" exitCode=0 Feb 25 15:50:16 crc kubenswrapper[4937]: I0225 15:50:16.007530 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" event={"ID":"f4250b20-1a8c-4c0c-88c0-2598c1e07503","Type":"ContainerDied","Data":"3eb1fefb0cabbd7a0e8143606ae4740aee2fcecba90cf22e30a19282ad17b6dd"} Feb 25 15:50:16 crc kubenswrapper[4937]: I0225 15:50:16.009357 4937 generic.go:334] "Generic (PLEG): container finished" podID="611bc243-950b-47fd-8341-f4a3ae40e27e" containerID="c78611f4b28b41f99492fc1502eb7e5f76b268b0ecf133a509992888b71b01bc" exitCode=0 Feb 25 15:50:16 crc kubenswrapper[4937]: I0225 15:50:16.009414 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" event={"ID":"611bc243-950b-47fd-8341-f4a3ae40e27e","Type":"ContainerDied","Data":"c78611f4b28b41f99492fc1502eb7e5f76b268b0ecf133a509992888b71b01bc"} Feb 25 15:50:16 crc kubenswrapper[4937]: I0225 15:50:16.093225 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:50:16 crc kubenswrapper[4937]: I0225 15:50:16.093280 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:50:16 crc kubenswrapper[4937]: I0225 15:50:16.093325 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-vd8vf" Feb 25 15:50:16 crc kubenswrapper[4937]: I0225 15:50:16.093946 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"dae24351144e2f191f51ee803738b86bb804b445acd2efd21fb0d1ae41d66e7f"} pod="openshift-console/downloads-7954f5f757-vd8vf" containerMessage="Container download-server failed liveness probe, will be restarted" Feb 25 15:50:16 crc kubenswrapper[4937]: I0225 15:50:16.093975 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" containerID="cri-o://dae24351144e2f191f51ee803738b86bb804b445acd2efd21fb0d1ae41d66e7f" gracePeriod=2 Feb 25 15:50:16 crc kubenswrapper[4937]: I0225 15:50:16.094435 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:50:16 crc kubenswrapper[4937]: I0225 15:50:16.094456 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:50:16 crc kubenswrapper[4937]: I0225 15:50:16.094753 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:50:16 crc kubenswrapper[4937]: I0225 15:50:16.094799 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:50:17 crc kubenswrapper[4937]: I0225 15:50:17.402304 4937 patch_prober.go:28] interesting pod/controller-manager-7d84cf4d9-8w4zr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Feb 25 15:50:17 crc kubenswrapper[4937]: I0225 15:50:17.402380 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" podUID="f4250b20-1a8c-4c0c-88c0-2598c1e07503" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Feb 25 15:50:17 crc kubenswrapper[4937]: I0225 15:50:17.422553 4937 patch_prober.go:28] interesting pod/route-controller-manager-7bb6fb86f5-w5qd9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Feb 25 15:50:17 crc kubenswrapper[4937]: I0225 15:50:17.422644 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" podUID="611bc243-950b-47fd-8341-f4a3ae40e27e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Feb 25 15:50:19 crc kubenswrapper[4937]: I0225 15:50:19.035757 4937 generic.go:334] "Generic (PLEG): container finished" podID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerID="dae24351144e2f191f51ee803738b86bb804b445acd2efd21fb0d1ae41d66e7f" exitCode=0 Feb 25 15:50:19 crc kubenswrapper[4937]: I0225 15:50:19.035821 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vd8vf" event={"ID":"d9c49432-4c74-4842-bdd2-880414a4ad0a","Type":"ContainerDied","Data":"dae24351144e2f191f51ee803738b86bb804b445acd2efd21fb0d1ae41d66e7f"} Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:23.285789 4937 ???:1] "http: TLS handshake error from 192.168.126.11:56948: no serving certificate available for the kubelet" Feb 
25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:26.092302 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:26.092354 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:26.424759 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mzv4j" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:28.402466 4937 patch_prober.go:28] interesting pod/controller-manager-7d84cf4d9-8w4zr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:28.402804 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" podUID="f4250b20-1a8c-4c0c-88c0-2598c1e07503" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:28.421955 4937 patch_prober.go:28] interesting pod/route-controller-manager-7bb6fb86f5-w5qd9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:28.422039 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" podUID="611bc243-950b-47fd-8341-f4a3ae40e27e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:29.348540 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.272578 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 25 15:50:31 crc kubenswrapper[4937]: E0225 15:50:30.273995 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3418c5c9-1699-469e-8fbb-78a8bce52af3" containerName="pruner" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.274038 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="3418c5c9-1699-469e-8fbb-78a8bce52af3" containerName="pruner" Feb 25 15:50:31 crc kubenswrapper[4937]: E0225 15:50:30.274069 4937 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b14c9c2b-b2a7-413e-b94d-be16cb50eeed" containerName="pruner" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.274084 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14c9c2b-b2a7-413e-b94d-be16cb50eeed" containerName="pruner" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.274295 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="3418c5c9-1699-469e-8fbb-78a8bce52af3" containerName="pruner" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.274323 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="b14c9c2b-b2a7-413e-b94d-be16cb50eeed" containerName="pruner" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.275042 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.278391 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.278833 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.283448 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.436154 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3441d1f7-f511-4231-8dc1-a522a8e572e8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3441d1f7-f511-4231-8dc1-a522a8e572e8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.436261 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3441d1f7-f511-4231-8dc1-a522a8e572e8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3441d1f7-f511-4231-8dc1-a522a8e572e8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.537294 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3441d1f7-f511-4231-8dc1-a522a8e572e8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3441d1f7-f511-4231-8dc1-a522a8e572e8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.537390 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3441d1f7-f511-4231-8dc1-a522a8e572e8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3441d1f7-f511-4231-8dc1-a522a8e572e8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.537432 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3441d1f7-f511-4231-8dc1-a522a8e572e8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3441d1f7-f511-4231-8dc1-a522a8e572e8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.564297 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/3441d1f7-f511-4231-8dc1-a522a8e572e8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3441d1f7-f511-4231-8dc1-a522a8e572e8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 15:50:31 crc kubenswrapper[4937]: I0225 15:50:30.595558 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 15:50:35 crc kubenswrapper[4937]: I0225 15:50:35.263177 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 25 15:50:35 crc kubenswrapper[4937]: I0225 15:50:35.264595 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 25 15:50:35 crc kubenswrapper[4937]: I0225 15:50:35.278121 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 25 15:50:35 crc kubenswrapper[4937]: I0225 15:50:35.321410 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8624c64-7d59-4173-8928-e7dff50f1039-var-lock\") pod \"installer-9-crc\" (UID: \"e8624c64-7d59-4173-8928-e7dff50f1039\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 15:50:35 crc kubenswrapper[4937]: I0225 15:50:35.321472 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8624c64-7d59-4173-8928-e7dff50f1039-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e8624c64-7d59-4173-8928-e7dff50f1039\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 15:50:35 crc kubenswrapper[4937]: I0225 15:50:35.321532 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8624c64-7d59-4173-8928-e7dff50f1039-kube-api-access\") pod \"installer-9-crc\" (UID: \"e8624c64-7d59-4173-8928-e7dff50f1039\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 15:50:35 crc kubenswrapper[4937]: I0225 15:50:35.423067 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8624c64-7d59-4173-8928-e7dff50f1039-var-lock\") pod \"installer-9-crc\" (UID: \"e8624c64-7d59-4173-8928-e7dff50f1039\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 15:50:35 crc kubenswrapper[4937]: I0225 15:50:35.423179 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8624c64-7d59-4173-8928-e7dff50f1039-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e8624c64-7d59-4173-8928-e7dff50f1039\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 15:50:35 crc kubenswrapper[4937]: I0225 15:50:35.423243 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8624c64-7d59-4173-8928-e7dff50f1039-kube-api-access\") pod \"installer-9-crc\" (UID: \"e8624c64-7d59-4173-8928-e7dff50f1039\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 15:50:35 crc kubenswrapper[4937]: I0225 15:50:35.423245 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8624c64-7d59-4173-8928-e7dff50f1039-var-lock\") pod \"installer-9-crc\" (UID: \"e8624c64-7d59-4173-8928-e7dff50f1039\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 25 15:50:35 crc kubenswrapper[4937]: I0225 15:50:35.423387 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8624c64-7d59-4173-8928-e7dff50f1039-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e8624c64-7d59-4173-8928-e7dff50f1039\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 15:50:35 crc kubenswrapper[4937]: I0225 15:50:35.454715 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8624c64-7d59-4173-8928-e7dff50f1039-kube-api-access\") pod \"installer-9-crc\" (UID: \"e8624c64-7d59-4173-8928-e7dff50f1039\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 15:50:35 crc kubenswrapper[4937]: I0225 15:50:35.591241 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 25 15:50:35 crc kubenswrapper[4937]: E0225 15:50:35.775907 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:e2eed51552d2dd4a9fa9dd28b6fcc63e47b2dccaa11e468c8a90b1a36ac80c15: Get \"https://registry.redhat.io/v2/redhat/certified-operator-index/blobs/sha256:e2eed51552d2dd4a9fa9dd28b6fcc63e47b2dccaa11e468c8a90b1a36ac80c15\": context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 25 15:50:35 crc kubenswrapper[4937]: E0225 15:50:35.776160 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zs9zp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-c4b95_openshift-marketplace(2506466d-db79-4b0e-a2df-2d64c56ad7cd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:e2eed51552d2dd4a9fa9dd28b6fcc63e47b2dccaa11e468c8a90b1a36ac80c15: Get 
\"https://registry.redhat.io/v2/redhat/certified-operator-index/blobs/sha256:e2eed51552d2dd4a9fa9dd28b6fcc63e47b2dccaa11e468c8a90b1a36ac80c15\": context canceled" logger="UnhandledError" Feb 25 15:50:35 crc kubenswrapper[4937]: E0225 15:50:35.777429 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:e2eed51552d2dd4a9fa9dd28b6fcc63e47b2dccaa11e468c8a90b1a36ac80c15: Get \\\"https://registry.redhat.io/v2/redhat/certified-operator-index/blobs/sha256:e2eed51552d2dd4a9fa9dd28b6fcc63e47b2dccaa11e468c8a90b1a36ac80c15\\\": context canceled\"" pod="openshift-marketplace/certified-operators-c4b95" podUID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" Feb 25 15:50:36 crc kubenswrapper[4937]: I0225 15:50:36.092800 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:50:36 crc kubenswrapper[4937]: I0225 15:50:36.092891 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:50:38 crc kubenswrapper[4937]: I0225 15:50:38.402291 4937 patch_prober.go:28] interesting pod/controller-manager-7d84cf4d9-8w4zr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 15:50:38 crc kubenswrapper[4937]: I0225 15:50:38.402383 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" podUID="f4250b20-1a8c-4c0c-88c0-2598c1e07503" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 15:50:38 crc kubenswrapper[4937]: I0225 15:50:38.421865 4937 patch_prober.go:28] interesting pod/route-controller-manager-7bb6fb86f5-w5qd9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 15:50:38 crc kubenswrapper[4937]: I0225 15:50:38.421946 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" podUID="611bc243-950b-47fd-8341-f4a3ae40e27e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 15:50:41 crc kubenswrapper[4937]: I0225 15:50:41.495131 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 15:50:41 crc kubenswrapper[4937]: I0225 15:50:41.495890 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 15:50:43 crc kubenswrapper[4937]: E0225 15:50:43.673698 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-c4b95" podUID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" Feb 25 15:50:46 crc kubenswrapper[4937]: I0225 15:50:46.095335 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:50:46 crc kubenswrapper[4937]: I0225 15:50:46.095400 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.402241 4937 patch_prober.go:28] interesting pod/controller-manager-7d84cf4d9-8w4zr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.403600 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" podUID="f4250b20-1a8c-4c0c-88c0-2598c1e07503" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.421956 4937 patch_prober.go:28] interesting pod/route-controller-manager-7bb6fb86f5-w5qd9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.422061 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" podUID="611bc243-950b-47fd-8341-f4a3ae40e27e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.740644 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.748739 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.784538 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74c6468575-pmbdt"] Feb 25 15:50:48 crc kubenswrapper[4937]: E0225 15:50:48.784909 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4250b20-1a8c-4c0c-88c0-2598c1e07503" containerName="controller-manager" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.784932 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4250b20-1a8c-4c0c-88c0-2598c1e07503" containerName="controller-manager" Feb 25 15:50:48 crc kubenswrapper[4937]: E0225 15:50:48.784956 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611bc243-950b-47fd-8341-f4a3ae40e27e" containerName="route-controller-manager" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.784969 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="611bc243-950b-47fd-8341-f4a3ae40e27e" containerName="route-controller-manager" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.785166 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="611bc243-950b-47fd-8341-f4a3ae40e27e" containerName="route-controller-manager" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.785195 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4250b20-1a8c-4c0c-88c0-2598c1e07503" containerName="controller-manager" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.785870 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.795522 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74c6468575-pmbdt"] Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.840853 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/611bc243-950b-47fd-8341-f4a3ae40e27e-client-ca\") pod \"611bc243-950b-47fd-8341-f4a3ae40e27e\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.840955 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-proxy-ca-bundles\") pod \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.841019 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-client-ca\") pod \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.841073 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/611bc243-950b-47fd-8341-f4a3ae40e27e-config\") pod \"611bc243-950b-47fd-8341-f4a3ae40e27e\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.841148 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6tzq\" (UniqueName: \"kubernetes.io/projected/f4250b20-1a8c-4c0c-88c0-2598c1e07503-kube-api-access-v6tzq\") pod \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.841181 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4250b20-1a8c-4c0c-88c0-2598c1e07503-serving-cert\") pod \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.841214 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq2ck\" (UniqueName: \"kubernetes.io/projected/611bc243-950b-47fd-8341-f4a3ae40e27e-kube-api-access-tq2ck\") pod \"611bc243-950b-47fd-8341-f4a3ae40e27e\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.841244 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611bc243-950b-47fd-8341-f4a3ae40e27e-serving-cert\") pod \"611bc243-950b-47fd-8341-f4a3ae40e27e\" (UID: \"611bc243-950b-47fd-8341-f4a3ae40e27e\") " Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.841279 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-config\") pod \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\" (UID: \"f4250b20-1a8c-4c0c-88c0-2598c1e07503\") " Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.841512 4937 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-proxy-ca-bundles\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.841570 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-serving-cert\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.841699 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gwbh\" (UniqueName: \"kubernetes.io/projected/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-kube-api-access-6gwbh\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.841771 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-config\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.841851 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-client-ca\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.842335 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/611bc243-950b-47fd-8341-f4a3ae40e27e-client-ca" (OuterVolumeSpecName: "client-ca") pod "611bc243-950b-47fd-8341-f4a3ae40e27e" (UID: "611bc243-950b-47fd-8341-f4a3ae40e27e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.842968 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-config" (OuterVolumeSpecName: "config") pod "f4250b20-1a8c-4c0c-88c0-2598c1e07503" (UID: "f4250b20-1a8c-4c0c-88c0-2598c1e07503"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.842994 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f4250b20-1a8c-4c0c-88c0-2598c1e07503" (UID: "f4250b20-1a8c-4c0c-88c0-2598c1e07503"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.843415 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/611bc243-950b-47fd-8341-f4a3ae40e27e-config" (OuterVolumeSpecName: "config") pod "611bc243-950b-47fd-8341-f4a3ae40e27e" (UID: "611bc243-950b-47fd-8341-f4a3ae40e27e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.843537 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-client-ca" (OuterVolumeSpecName: "client-ca") pod "f4250b20-1a8c-4c0c-88c0-2598c1e07503" (UID: "f4250b20-1a8c-4c0c-88c0-2598c1e07503"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.851455 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611bc243-950b-47fd-8341-f4a3ae40e27e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "611bc243-950b-47fd-8341-f4a3ae40e27e" (UID: "611bc243-950b-47fd-8341-f4a3ae40e27e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.852762 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/611bc243-950b-47fd-8341-f4a3ae40e27e-kube-api-access-tq2ck" (OuterVolumeSpecName: "kube-api-access-tq2ck") pod "611bc243-950b-47fd-8341-f4a3ae40e27e" (UID: "611bc243-950b-47fd-8341-f4a3ae40e27e"). InnerVolumeSpecName "kube-api-access-tq2ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.852886 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4250b20-1a8c-4c0c-88c0-2598c1e07503-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f4250b20-1a8c-4c0c-88c0-2598c1e07503" (UID: "f4250b20-1a8c-4c0c-88c0-2598c1e07503"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.852917 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4250b20-1a8c-4c0c-88c0-2598c1e07503-kube-api-access-v6tzq" (OuterVolumeSpecName: "kube-api-access-v6tzq") pod "f4250b20-1a8c-4c0c-88c0-2598c1e07503" (UID: "f4250b20-1a8c-4c0c-88c0-2598c1e07503"). InnerVolumeSpecName "kube-api-access-v6tzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.943234 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gwbh\" (UniqueName: \"kubernetes.io/projected/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-kube-api-access-6gwbh\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.943308 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-config\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.944649 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-client-ca\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.944880 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-proxy-ca-bundles\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.945097 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-serving-cert\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.944945 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-config\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.945654 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6tzq\" (UniqueName: \"kubernetes.io/projected/f4250b20-1a8c-4c0c-88c0-2598c1e07503-kube-api-access-v6tzq\") on node \"crc\" DevicePath \"\"" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.945886 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4250b20-1a8c-4c0c-88c0-2598c1e07503-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.946045 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq2ck\" (UniqueName: \"kubernetes.io/projected/611bc243-950b-47fd-8341-f4a3ae40e27e-kube-api-access-tq2ck\") on node \"crc\" DevicePath \"\"" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.946196 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/611bc243-950b-47fd-8341-f4a3ae40e27e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.946347 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.946519 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/611bc243-950b-47fd-8341-f4a3ae40e27e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.946678 4937 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.946792 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-client-ca\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.945760 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-proxy-ca-bundles\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.946991 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4250b20-1a8c-4c0c-88c0-2598c1e07503-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.947122 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/611bc243-950b-47fd-8341-f4a3ae40e27e-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.949220 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-serving-cert\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:48 crc kubenswrapper[4937]: I0225 15:50:48.964129 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gwbh\" (UniqueName: \"kubernetes.io/projected/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-kube-api-access-6gwbh\") pod \"controller-manager-74c6468575-pmbdt\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:49 crc kubenswrapper[4937]: I0225 15:50:49.122313 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:50:49 crc kubenswrapper[4937]: I0225 15:50:49.236590 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" Feb 25 15:50:49 crc kubenswrapper[4937]: I0225 15:50:49.236620 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr" event={"ID":"f4250b20-1a8c-4c0c-88c0-2598c1e07503","Type":"ContainerDied","Data":"8fce08b94c06222ea50e752e2387d356cb7a26ee3a0fd505291babd73149428d"} Feb 25 15:50:49 crc kubenswrapper[4937]: I0225 15:50:49.236925 4937 scope.go:117] "RemoveContainer" containerID="3eb1fefb0cabbd7a0e8143606ae4740aee2fcecba90cf22e30a19282ad17b6dd" Feb 25 15:50:49 crc kubenswrapper[4937]: I0225 15:50:49.239916 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" event={"ID":"611bc243-950b-47fd-8341-f4a3ae40e27e","Type":"ContainerDied","Data":"3e23b3929af50ed0b59e8afb5ccaf24a28719ecd68cc3cb34075bd19c5629409"} Feb 25 15:50:49 crc kubenswrapper[4937]: I0225 15:50:49.239980 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9" Feb 25 15:50:49 crc kubenswrapper[4937]: I0225 15:50:49.295738 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9"] Feb 25 15:50:49 crc kubenswrapper[4937]: I0225 15:50:49.309414 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bb6fb86f5-w5qd9"] Feb 25 15:50:49 crc kubenswrapper[4937]: I0225 15:50:49.316919 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr"] Feb 25 15:50:49 crc kubenswrapper[4937]: I0225 15:50:49.320290 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7d84cf4d9-8w4zr"] Feb 25 15:50:49 crc kubenswrapper[4937]: I0225 15:50:49.375748 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="611bc243-950b-47fd-8341-f4a3ae40e27e" path="/var/lib/kubelet/pods/611bc243-950b-47fd-8341-f4a3ae40e27e/volumes" Feb 25 15:50:49 crc kubenswrapper[4937]: I0225 15:50:49.376427 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4250b20-1a8c-4c0c-88c0-2598c1e07503" path="/var/lib/kubelet/pods/f4250b20-1a8c-4c0c-88c0-2598c1e07503/volumes" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.086798 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb"] Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.087995 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.090824 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.091459 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.094007 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.094194 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.094627 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.094803 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.106615 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb"] Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.183943 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8szzd\" (UniqueName: \"kubernetes.io/projected/becb881e-db45-4958-90a0-0d20250d2ff1-kube-api-access-8szzd\") pod \"route-controller-manager-65cb4f6c9c-2z2lb\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.184036 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/becb881e-db45-4958-90a0-0d20250d2ff1-config\") pod \"route-controller-manager-65cb4f6c9c-2z2lb\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.184111 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/becb881e-db45-4958-90a0-0d20250d2ff1-client-ca\") pod \"route-controller-manager-65cb4f6c9c-2z2lb\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.184354 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/becb881e-db45-4958-90a0-0d20250d2ff1-serving-cert\") pod \"route-controller-manager-65cb4f6c9c-2z2lb\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.285831 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/becb881e-db45-4958-90a0-0d20250d2ff1-client-ca\") pod 
\"route-controller-manager-65cb4f6c9c-2z2lb\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.285935 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/becb881e-db45-4958-90a0-0d20250d2ff1-serving-cert\") pod \"route-controller-manager-65cb4f6c9c-2z2lb\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.285983 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8szzd\" (UniqueName: \"kubernetes.io/projected/becb881e-db45-4958-90a0-0d20250d2ff1-kube-api-access-8szzd\") pod \"route-controller-manager-65cb4f6c9c-2z2lb\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.286023 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/becb881e-db45-4958-90a0-0d20250d2ff1-config\") pod \"route-controller-manager-65cb4f6c9c-2z2lb\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.287725 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/becb881e-db45-4958-90a0-0d20250d2ff1-config\") pod \"route-controller-manager-65cb4f6c9c-2z2lb\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.291433 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/becb881e-db45-4958-90a0-0d20250d2ff1-client-ca\") pod \"route-controller-manager-65cb4f6c9c-2z2lb\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.295258 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/becb881e-db45-4958-90a0-0d20250d2ff1-serving-cert\") pod \"route-controller-manager-65cb4f6c9c-2z2lb\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.325794 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8szzd\" (UniqueName: \"kubernetes.io/projected/becb881e-db45-4958-90a0-0d20250d2ff1-kube-api-access-8szzd\") pod \"route-controller-manager-65cb4f6c9c-2z2lb\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:51 crc kubenswrapper[4937]: I0225 15:50:51.414195 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:50:56 crc kubenswrapper[4937]: I0225 15:50:56.092713 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:50:56 crc kubenswrapper[4937]: I0225 15:50:56.093464 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:51:01 crc kubenswrapper[4937]: E0225 15:51:01.084182 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage4121943614/3\": happened during read: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 25 15:51:01 crc kubenswrapper[4937]: E0225 15:51:01.084808 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v8d7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-sbgvv_openshift-marketplace(fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage4121943614/3\": happened during read: context canceled" logger="UnhandledError" Feb 25 15:51:01 crc kubenswrapper[4937]: E0225 15:51:01.086104 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file 
\\\"/var/tmp/container_images_storage4121943614/3\\\": happened during read: context canceled\"" pod="openshift-marketplace/community-operators-sbgvv" podUID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" Feb 25 15:51:04 crc kubenswrapper[4937]: I0225 15:51:04.281850 4937 ???:1] "http: TLS handshake error from 192.168.126.11:36634: no serving certificate available for the kubelet" Feb 25 15:51:06 crc kubenswrapper[4937]: I0225 15:51:06.092943 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:51:06 crc kubenswrapper[4937]: I0225 15:51:06.093464 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:51:11 crc kubenswrapper[4937]: I0225 15:51:11.495303 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 15:51:11 crc kubenswrapper[4937]: I0225 15:51:11.495381 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 15:51:11 crc kubenswrapper[4937]: I0225 15:51:11.495430 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:51:11 crc kubenswrapper[4937]: I0225 15:51:11.496187 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 15:51:11 crc kubenswrapper[4937]: I0225 15:51:11.496266 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255" gracePeriod=600 Feb 25 15:51:12 crc kubenswrapper[4937]: I0225 15:51:12.173600 4937 scope.go:117] "RemoveContainer" containerID="c78611f4b28b41f99492fc1502eb7e5f76b268b0ecf133a509992888b71b01bc" Feb 25 15:51:12 crc kubenswrapper[4937]: E0225 15:51:12.365694 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage2182226390/3\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 25 15:51:12 crc kubenswrapper[4937]: E0225 15:51:12.365866 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init 
container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2jjdr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mrgxt_openshift-marketplace(3c753535-03f4-4888-8e28-43b4924726ae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage2182226390/3\": happened during read: context canceled" logger="UnhandledError" Feb 25 15:51:12 crc kubenswrapper[4937]: E0225 15:51:12.367032 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage2182226390/3\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mrgxt" podUID="3c753535-03f4-4888-8e28-43b4924726ae" Feb 25 15:51:12 crc kubenswrapper[4937]: I0225 15:51:12.395879 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255" exitCode=0 Feb 25 15:51:12 crc kubenswrapper[4937]: I0225 15:51:12.395960 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255"} Feb 25 15:51:16 crc kubenswrapper[4937]: I0225 15:51:16.095515 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:51:16 crc kubenswrapper[4937]: I0225 15:51:16.095961 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:51:16 crc kubenswrapper[4937]: E0225 15:51:16.233393 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363\": context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 25 15:51:16 crc kubenswrapper[4937]: E0225 15:51:16.233601 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4wwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-l9tlm_openshift-marketplace(fae7d336-b701-4174-bdae-bd3f1bc032b1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363\": context canceled" logger="UnhandledError" Feb 25 15:51:16 crc kubenswrapper[4937]: E0225 15:51:16.234838 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363: Get \\\"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363\\\": context canceled\"" pod="openshift-marketplace/redhat-operators-l9tlm" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" Feb 25 15:51:16 crc kubenswrapper[4937]: E0225 15:51:16.628730 4937 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying system image from manifest list: reading blob sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363\": context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 25 15:51:16 crc kubenswrapper[4937]: E0225 15:51:16.629273 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgc6d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-scndr_openshift-marketplace(dc970acf-3cdb-4951-8f35-705ce003550f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363\": context canceled" logger="UnhandledError" Feb 25 15:51:16 crc kubenswrapper[4937]: E0225 15:51:16.630901 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363: Get \\\"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363\\\": context canceled\"" pod="openshift-marketplace/redhat-operators-scndr" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" Feb 25 15:51:17 crc kubenswrapper[4937]: E0225 15:51:17.396260 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mrgxt" podUID="3c753535-03f4-4888-8e28-43b4924726ae" Feb 25 15:51:17 crc 
kubenswrapper[4937]: E0225 15:51:17.396350 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-l9tlm" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" Feb 25 15:51:17 crc kubenswrapper[4937]: E0225 15:51:17.518714 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:2086b7801d96d309e48e1c678789d95541de89bbae905e6f5a8de845927ca051: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:2086b7801d96d309e48e1c678789d95541de89bbae905e6f5a8de845927ca051\": context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 25 15:51:17 crc kubenswrapper[4937]: E0225 15:51:17.518992 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7zktg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gsxxs_openshift-marketplace(e1e375ad-9093-48c0-8f06-d8ae9ad9b46b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:2086b7801d96d309e48e1c678789d95541de89bbae905e6f5a8de845927ca051: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:2086b7801d96d309e48e1c678789d95541de89bbae905e6f5a8de845927ca051\": context canceled" logger="UnhandledError" Feb 25 15:51:17 crc kubenswrapper[4937]: E0225 15:51:17.520241 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:2086b7801d96d309e48e1c678789d95541de89bbae905e6f5a8de845927ca051: Get \\\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:2086b7801d96d309e48e1c678789d95541de89bbae905e6f5a8de845927ca051\\\": context canceled\"" 
pod="openshift-marketplace/redhat-marketplace-gsxxs" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" Feb 25 15:51:19 crc kubenswrapper[4937]: E0225 15:51:19.228570 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gsxxs" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" Feb 25 15:51:19 crc kubenswrapper[4937]: E0225 15:51:19.229665 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-scndr" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" Feb 25 15:51:19 crc kubenswrapper[4937]: E0225 15:51:19.342597 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 25 15:51:19 crc kubenswrapper[4937]: E0225 15:51:19.342780 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h4vcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-54sqd_openshift-marketplace(af3c547e-6bf2-4fd5-b375-5ad1c2c6959c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 25 15:51:19 crc kubenswrapper[4937]: E0225 15:51:19.344180 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-54sqd" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" Feb 25 15:51:20 crc kubenswrapper[4937]: E0225 
15:51:20.671684 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-54sqd" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" Feb 25 15:51:20 crc kubenswrapper[4937]: E0225 15:51:20.780520 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 25 15:51:20 crc kubenswrapper[4937]: E0225 15:51:20.781205 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ppp72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-l8xkp_openshift-marketplace(534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 25 15:51:20 crc kubenswrapper[4937]: E0225 15:51:20.782727 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-l8xkp" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" Feb 25 15:51:20 crc kubenswrapper[4937]: I0225 15:51:20.881060 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 25 15:51:21 crc kubenswrapper[4937]: I0225 15:51:21.298666 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 25 15:51:21 crc kubenswrapper[4937]: I0225 15:51:21.408144 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74c6468575-pmbdt"] Feb 25 15:51:21 crc kubenswrapper[4937]: I0225 
15:51:21.438599 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb"] Feb 25 15:51:21 crc kubenswrapper[4937]: I0225 15:51:21.468093 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"c9825ea4151fdf2ed4aea12d0ed9b0d4287c1ad0da21c88a4d5b343d65fcffef"} Feb 25 15:51:21 crc kubenswrapper[4937]: I0225 15:51:21.470974 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vd8vf" event={"ID":"d9c49432-4c74-4842-bdd2-880414a4ad0a","Type":"ContainerStarted","Data":"54dbeb18da1c55efc083feced805ec943117fc244ceb2e892952dfb16498cdde"} Feb 25 15:51:21 crc kubenswrapper[4937]: I0225 15:51:21.472303 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-vd8vf" Feb 25 15:51:21 crc kubenswrapper[4937]: I0225 15:51:21.472493 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:51:21 crc kubenswrapper[4937]: I0225 15:51:21.472546 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:51:21 crc kubenswrapper[4937]: I0225 15:51:21.476557 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e8624c64-7d59-4173-8928-e7dff50f1039","Type":"ContainerStarted","Data":"e0a0898652bff388c81683ce0ec24e45cf61aa40e4e64faf47834486db0cbd1c"} Feb 25 15:51:21 crc kubenswrapper[4937]: I0225 15:51:21.477925 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3441d1f7-f511-4231-8dc1-a522a8e572e8","Type":"ContainerStarted","Data":"c38863ba8e247a3a0c8d3a8a7a54f716366ca347cc0b5dc5e68fe8bb1013fa92"} Feb 25 15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.106201 4937 csr.go:261] certificate signing request csr-r5pb4 is approved, waiting to be issued Feb 25 15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.114532 4937 csr.go:257] certificate signing request csr-r5pb4 is issued Feb 25 15:51:22 crc kubenswrapper[4937]: W0225 15:51:22.405539 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1a4d531_e2dd_460e_ba0c_2c6572c5dea4.slice/crio-2acdb5949b753bb791267d5f873350f07d7865397a6ef6fe01e4e3154d4eb13e WatchSource:0}: Error finding container 2acdb5949b753bb791267d5f873350f07d7865397a6ef6fe01e4e3154d4eb13e: Status 404 returned error can't find the container with id 2acdb5949b753bb791267d5f873350f07d7865397a6ef6fe01e4e3154d4eb13e Feb 25 15:51:22 crc kubenswrapper[4937]: E0225 15:51:22.406287 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-l8xkp" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" Feb 25 
15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.495896 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" event={"ID":"becb881e-db45-4958-90a0-0d20250d2ff1","Type":"ContainerStarted","Data":"668898a46515f27931afa571ec096fb1253b101be70fb535651109e43fb5d48f"} Feb 25 15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.497836 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e8624c64-7d59-4173-8928-e7dff50f1039","Type":"ContainerStarted","Data":"062bc97550bb9a8d6bb08e3a94bfe79a1a2302a233315a3830380be5d642fcd7"} Feb 25 15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.501769 4937 generic.go:334] "Generic (PLEG): container finished" podID="0c63f82a-9346-476b-ae17-edb260b2a36f" containerID="7d4a5e1a11786b57a50a8e25125cd85bfbc33b6976f9e0ae9c6b9672b97193b3" exitCode=0 Feb 25 15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.501901 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533908-nq4vc" event={"ID":"0c63f82a-9346-476b-ae17-edb260b2a36f","Type":"ContainerDied","Data":"7d4a5e1a11786b57a50a8e25125cd85bfbc33b6976f9e0ae9c6b9672b97193b3"} Feb 25 15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.503291 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" event={"ID":"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4","Type":"ContainerStarted","Data":"2acdb5949b753bb791267d5f873350f07d7865397a6ef6fe01e4e3154d4eb13e"} Feb 25 15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.508445 4937 generic.go:334] "Generic (PLEG): container finished" podID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" containerID="d2c599db28b3599d1a3907d6cf44f0728dd9f31fdd7e2f554c9e80cda476b94c" exitCode=0 Feb 25 15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.508531 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4b95" event={"ID":"2506466d-db79-4b0e-a2df-2d64c56ad7cd","Type":"ContainerDied","Data":"d2c599db28b3599d1a3907d6cf44f0728dd9f31fdd7e2f554c9e80cda476b94c"} Feb 25 15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.518983 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=47.518930556 podStartE2EDuration="47.518930556s" podCreationTimestamp="2026-02-25 15:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:51:22.516085103 +0000 UTC m=+333.529477003" watchObservedRunningTime="2026-02-25 15:51:22.518930556 +0000 UTC m=+333.532322446" Feb 25 15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.519794 4937 generic.go:334] "Generic (PLEG): container finished" podID="6e6e93a3-6673-464f-84a3-5585a6cbc0a8" containerID="659ed1300e90273684c34bf480b419cb492c3a5b6b5118aaba0086df07266833" exitCode=0 Feb 25 15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.520705 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533910-n4bgf" event={"ID":"6e6e93a3-6673-464f-84a3-5585a6cbc0a8","Type":"ContainerDied","Data":"659ed1300e90273684c34bf480b419cb492c3a5b6b5118aaba0086df07266833"} Feb 25 15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.521662 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:51:22 crc kubenswrapper[4937]: I0225 15:51:22.521718 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.116407 4937 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-10 22:44:38.417309111 +0000 UTC Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.117026 4937 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6918h53m15.300287583s for next certificate rotation Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.528058 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3441d1f7-f511-4231-8dc1-a522a8e572e8","Type":"ContainerStarted","Data":"95a37757bdd7786931f80caf54c2b3bc4973a61c00ee6e42b06fd485be1fb71d"} Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.529805 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" event={"ID":"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4","Type":"ContainerStarted","Data":"0bbeaa7e6741fe652f1a8eb13a12f304dd439e906b21179a15114b20dda839b0"} Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.530016 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.532069 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4b95" event={"ID":"2506466d-db79-4b0e-a2df-2d64c56ad7cd","Type":"ContainerStarted","Data":"311dae1aff7b2cf4299fc6e995712669ef0a537a831b933fc89a8ea39688f991"} Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.534728 4937 generic.go:334] "Generic (PLEG): container finished" podID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" containerID="b0a3761169e64f2100446d82ebf67a7d7bdb0d3e34358469dbbe32515344cb77" exitCode=0 Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.534812 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbgvv" event={"ID":"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9","Type":"ContainerDied","Data":"b0a3761169e64f2100446d82ebf67a7d7bdb0d3e34358469dbbe32515344cb77"} Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.537876 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.538913 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" event={"ID":"becb881e-db45-4958-90a0-0d20250d2ff1","Type":"ContainerStarted","Data":"df25e3ac8c6cd219bc5ce83338ca874e4499ce3d25028898b77497c221eba258"} Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.538950 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.539740 4937 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.539775 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.548623 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=53.548599192 podStartE2EDuration="53.548599192s" podCreationTimestamp="2026-02-25 15:50:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:51:23.545723727 +0000 UTC m=+334.559115627" watchObservedRunningTime="2026-02-25 15:51:23.548599192 +0000 UTC m=+334.561991072" Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.566450 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" podStartSLOduration=49.566425921 podStartE2EDuration="49.566425921s" podCreationTimestamp="2026-02-25 15:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:51:23.563163627 +0000 UTC m=+334.576555517" watchObservedRunningTime="2026-02-25 15:51:23.566425921 +0000 UTC m=+334.579817831" Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.591156 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c4b95" podStartSLOduration=2.099461016 podStartE2EDuration="1m28.591126627s" podCreationTimestamp="2026-02-25 15:49:55 +0000 UTC" firstStartedPulling="2026-02-25 15:49:56.638093205 +0000 UTC m=+247.651485095" lastFinishedPulling="2026-02-25 15:51:23.129758816 +0000 UTC m=+334.143150706" observedRunningTime="2026-02-25 15:51:23.587501723 +0000 UTC m=+334.600893633" watchObservedRunningTime="2026-02-25 15:51:23.591126627 +0000 UTC m=+334.604518517" Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.612784 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" podStartSLOduration=49.612761544 podStartE2EDuration="49.612761544s" podCreationTimestamp="2026-02-25 15:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:51:23.611274695 +0000 UTC m=+334.624666585" watchObservedRunningTime="2026-02-25 15:51:23.612761544 +0000 UTC m=+334.626153434" Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.663038 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.973935 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533910-n4bgf" Feb 25 15:51:23 crc kubenswrapper[4937]: I0225 15:51:23.988577 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533908-nq4vc" Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.088970 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krm9b\" (UniqueName: \"kubernetes.io/projected/6e6e93a3-6673-464f-84a3-5585a6cbc0a8-kube-api-access-krm9b\") pod \"6e6e93a3-6673-464f-84a3-5585a6cbc0a8\" (UID: \"6e6e93a3-6673-464f-84a3-5585a6cbc0a8\") " Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.097170 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e6e93a3-6673-464f-84a3-5585a6cbc0a8-kube-api-access-krm9b" (OuterVolumeSpecName: "kube-api-access-krm9b") pod "6e6e93a3-6673-464f-84a3-5585a6cbc0a8" (UID: "6e6e93a3-6673-464f-84a3-5585a6cbc0a8"). InnerVolumeSpecName "kube-api-access-krm9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.190856 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmp7m\" (UniqueName: \"kubernetes.io/projected/0c63f82a-9346-476b-ae17-edb260b2a36f-kube-api-access-qmp7m\") pod \"0c63f82a-9346-476b-ae17-edb260b2a36f\" (UID: \"0c63f82a-9346-476b-ae17-edb260b2a36f\") " Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.191512 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krm9b\" (UniqueName: \"kubernetes.io/projected/6e6e93a3-6673-464f-84a3-5585a6cbc0a8-kube-api-access-krm9b\") on node \"crc\" DevicePath \"\"" Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.195948 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c63f82a-9346-476b-ae17-edb260b2a36f-kube-api-access-qmp7m" (OuterVolumeSpecName: "kube-api-access-qmp7m") pod "0c63f82a-9346-476b-ae17-edb260b2a36f" (UID: "0c63f82a-9346-476b-ae17-edb260b2a36f"). InnerVolumeSpecName "kube-api-access-qmp7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.292891 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmp7m\" (UniqueName: \"kubernetes.io/projected/0c63f82a-9346-476b-ae17-edb260b2a36f-kube-api-access-qmp7m\") on node \"crc\" DevicePath \"\"" Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.544442 4937 generic.go:334] "Generic (PLEG): container finished" podID="3441d1f7-f511-4231-8dc1-a522a8e572e8" containerID="95a37757bdd7786931f80caf54c2b3bc4973a61c00ee6e42b06fd485be1fb71d" exitCode=0 Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.544537 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3441d1f7-f511-4231-8dc1-a522a8e572e8","Type":"ContainerDied","Data":"95a37757bdd7786931f80caf54c2b3bc4973a61c00ee6e42b06fd485be1fb71d"} Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.546258 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533910-n4bgf" event={"ID":"6e6e93a3-6673-464f-84a3-5585a6cbc0a8","Type":"ContainerDied","Data":"4820e8c1cb17bdca4851c2b966ae4968e408f258e779cddaac92c9744df71df4"} Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.546289 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4820e8c1cb17bdca4851c2b966ae4968e408f258e779cddaac92c9744df71df4" Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.546272 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533910-n4bgf" Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.549151 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533908-nq4vc" Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.552302 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533908-nq4vc" event={"ID":"0c63f82a-9346-476b-ae17-edb260b2a36f","Type":"ContainerDied","Data":"153d2a13710035e0387f4054ac624b53d6df4499ed57d91be0f0217848cd6dfd"} Feb 25 15:51:24 crc kubenswrapper[4937]: I0225 15:51:24.552336 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="153d2a13710035e0387f4054ac624b53d6df4499ed57d91be0f0217848cd6dfd" Feb 25 15:51:25 crc kubenswrapper[4937]: I0225 15:51:25.601411 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:51:25 crc kubenswrapper[4937]: I0225 15:51:25.602223 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:51:25 crc kubenswrapper[4937]: I0225 15:51:25.909685 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 15:51:25 crc kubenswrapper[4937]: I0225 15:51:25.924859 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3441d1f7-f511-4231-8dc1-a522a8e572e8-kube-api-access\") pod \"3441d1f7-f511-4231-8dc1-a522a8e572e8\" (UID: \"3441d1f7-f511-4231-8dc1-a522a8e572e8\") " Feb 25 15:51:25 crc kubenswrapper[4937]: I0225 15:51:25.924913 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3441d1f7-f511-4231-8dc1-a522a8e572e8-kubelet-dir\") pod \"3441d1f7-f511-4231-8dc1-a522a8e572e8\" (UID: \"3441d1f7-f511-4231-8dc1-a522a8e572e8\") " Feb 25 15:51:25 crc kubenswrapper[4937]: I0225 15:51:25.924990 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3441d1f7-f511-4231-8dc1-a522a8e572e8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3441d1f7-f511-4231-8dc1-a522a8e572e8" (UID: "3441d1f7-f511-4231-8dc1-a522a8e572e8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:51:25 crc kubenswrapper[4937]: I0225 15:51:25.925258 4937 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3441d1f7-f511-4231-8dc1-a522a8e572e8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 25 15:51:25 crc kubenswrapper[4937]: I0225 15:51:25.942612 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3441d1f7-f511-4231-8dc1-a522a8e572e8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3441d1f7-f511-4231-8dc1-a522a8e572e8" (UID: "3441d1f7-f511-4231-8dc1-a522a8e572e8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:51:26 crc kubenswrapper[4937]: I0225 15:51:26.026328 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3441d1f7-f511-4231-8dc1-a522a8e572e8-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 15:51:26 crc kubenswrapper[4937]: I0225 15:51:26.092394 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:51:26 crc kubenswrapper[4937]: I0225 15:51:26.092456 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:51:26 crc kubenswrapper[4937]: I0225 15:51:26.092591 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:51:26 crc kubenswrapper[4937]: I0225 15:51:26.092651 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:51:26 crc kubenswrapper[4937]: I0225 15:51:26.567069 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3441d1f7-f511-4231-8dc1-a522a8e572e8","Type":"ContainerDied","Data":"c38863ba8e247a3a0c8d3a8a7a54f716366ca347cc0b5dc5e68fe8bb1013fa92"} Feb 25 15:51:26 crc kubenswrapper[4937]: I0225 15:51:26.567181 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c38863ba8e247a3a0c8d3a8a7a54f716366ca347cc0b5dc5e68fe8bb1013fa92" Feb 25 15:51:26 crc kubenswrapper[4937]: I0225 15:51:26.567116 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 15:51:27 crc kubenswrapper[4937]: I0225 15:51:27.345659 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-c4b95" podUID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" containerName="registry-server" probeResult="failure" output=< Feb 25 15:51:27 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 15:51:27 crc kubenswrapper[4937]: > Feb 25 15:51:29 crc kubenswrapper[4937]: I0225 15:51:29.587065 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbgvv" event={"ID":"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9","Type":"ContainerStarted","Data":"6fcb26a16d5d62bcc02d7506965dab9a5b707774bcab2a63cf1cfbd03df8e044"} Feb 25 15:51:29 crc kubenswrapper[4937]: I0225 15:51:29.621429 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sbgvv" podStartSLOduration=9.804223165 podStartE2EDuration="1m34.621408613s" podCreationTimestamp="2026-02-25 15:49:55 +0000 UTC" firstStartedPulling="2026-02-25 15:50:03.769233182 +0000 UTC m=+254.782625082" lastFinishedPulling="2026-02-25 15:51:28.58641862 +0000 UTC m=+339.599810530" observedRunningTime="2026-02-25 15:51:29.617596915 +0000 UTC m=+340.630988815" watchObservedRunningTime="2026-02-25 15:51:29.621408613 +0000 UTC m=+340.634800513" Feb 25 15:51:35 crc kubenswrapper[4937]: I0225 15:51:35.698910 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:51:35 crc kubenswrapper[4937]: I0225 15:51:35.699947 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:51:35 crc kubenswrapper[4937]: I0225 15:51:35.757391 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:51:35 crc kubenswrapper[4937]: I0225 15:51:35.766575 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:51:35 crc kubenswrapper[4937]: I0225 15:51:35.822064 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:51:36 crc kubenswrapper[4937]: I0225 15:51:36.092753 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:51:36 crc kubenswrapper[4937]: I0225 15:51:36.092784 4937 patch_prober.go:28] interesting pod/downloads-7954f5f757-vd8vf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 25 15:51:36 crc kubenswrapper[4937]: I0225 15:51:36.092860 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:51:36 crc kubenswrapper[4937]: I0225 15:51:36.092875 4937 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-vd8vf" podUID="d9c49432-4c74-4842-bdd2-880414a4ad0a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 25 15:51:36 crc kubenswrapper[4937]: I0225 15:51:36.815526 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:51:37 crc kubenswrapper[4937]: I0225 15:51:37.605571 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sbgvv"] Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.117862 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.118138 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.120550 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.120621 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.138989 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.142229 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.201795 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c4b95"] Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.202073 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c4b95" podUID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" containerName="registry-server" containerID="cri-o://311dae1aff7b2cf4299fc6e995712669ef0a537a831b933fc89a8ea39688f991" gracePeriod=2 Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.219281 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.219460 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.221377 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.232466 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.248430 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.250311 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.293346 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.424435 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.435853 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:51:38 crc kubenswrapper[4937]: I0225 15:51:38.646513 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sbgvv" podUID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" containerName="registry-server" containerID="cri-o://6fcb26a16d5d62bcc02d7506965dab9a5b707774bcab2a63cf1cfbd03df8e044" gracePeriod=2 Feb 25 15:51:39 crc kubenswrapper[4937]: I0225 15:51:39.654548 4937 generic.go:334] "Generic (PLEG): container finished" podID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" containerID="311dae1aff7b2cf4299fc6e995712669ef0a537a831b933fc89a8ea39688f991" exitCode=0 Feb 25 15:51:39 crc kubenswrapper[4937]: I0225 15:51:39.654631 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4b95" event={"ID":"2506466d-db79-4b0e-a2df-2d64c56ad7cd","Type":"ContainerDied","Data":"311dae1aff7b2cf4299fc6e995712669ef0a537a831b933fc89a8ea39688f991"} Feb 25 15:51:39 crc kubenswrapper[4937]: I0225 15:51:39.657270 4937 generic.go:334] "Generic (PLEG): container finished" podID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" containerID="6fcb26a16d5d62bcc02d7506965dab9a5b707774bcab2a63cf1cfbd03df8e044" exitCode=0 Feb 25 15:51:39 crc kubenswrapper[4937]: I0225 15:51:39.657316 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbgvv" event={"ID":"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9","Type":"ContainerDied","Data":"6fcb26a16d5d62bcc02d7506965dab9a5b707774bcab2a63cf1cfbd03df8e044"} Feb 25 15:51:45 crc kubenswrapper[4937]: E0225 15:51:45.603172 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 311dae1aff7b2cf4299fc6e995712669ef0a537a831b933fc89a8ea39688f991 is running failed: container process not found" containerID="311dae1aff7b2cf4299fc6e995712669ef0a537a831b933fc89a8ea39688f991" cmd=["grpc_health_probe","-addr=:50051"] Feb 25 15:51:45 crc kubenswrapper[4937]: E0225 15:51:45.604188 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 311dae1aff7b2cf4299fc6e995712669ef0a537a831b933fc89a8ea39688f991 is running failed: container process not found" containerID="311dae1aff7b2cf4299fc6e995712669ef0a537a831b933fc89a8ea39688f991" cmd=["grpc_health_probe","-addr=:50051"] Feb 25 15:51:45 crc kubenswrapper[4937]: E0225 15:51:45.605076 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 311dae1aff7b2cf4299fc6e995712669ef0a537a831b933fc89a8ea39688f991 is running failed: container process not found" containerID="311dae1aff7b2cf4299fc6e995712669ef0a537a831b933fc89a8ea39688f991" cmd=["grpc_health_probe","-addr=:50051"] Feb 25 15:51:45 crc kubenswrapper[4937]: E0225 15:51:45.605107 4937 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 311dae1aff7b2cf4299fc6e995712669ef0a537a831b933fc89a8ea39688f991 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-c4b95" podUID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" containerName="registry-server" Feb 25 15:51:45 crc kubenswrapper[4937]: E0225 15:51:45.699253 4937 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6fcb26a16d5d62bcc02d7506965dab9a5b707774bcab2a63cf1cfbd03df8e044 is running failed: container process not found" containerID="6fcb26a16d5d62bcc02d7506965dab9a5b707774bcab2a63cf1cfbd03df8e044" cmd=["grpc_health_probe","-addr=:50051"] Feb 25 15:51:45 crc kubenswrapper[4937]: E0225 15:51:45.699740 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6fcb26a16d5d62bcc02d7506965dab9a5b707774bcab2a63cf1cfbd03df8e044 is running failed: container process not found" containerID="6fcb26a16d5d62bcc02d7506965dab9a5b707774bcab2a63cf1cfbd03df8e044" cmd=["grpc_health_probe","-addr=:50051"] Feb 25 15:51:45 crc kubenswrapper[4937]: E0225 15:51:45.700140 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6fcb26a16d5d62bcc02d7506965dab9a5b707774bcab2a63cf1cfbd03df8e044 is running failed: container process not found" containerID="6fcb26a16d5d62bcc02d7506965dab9a5b707774bcab2a63cf1cfbd03df8e044" cmd=["grpc_health_probe","-addr=:50051"] Feb 25 15:51:45 crc kubenswrapper[4937]: E0225 15:51:45.700208 4937 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6fcb26a16d5d62bcc02d7506965dab9a5b707774bcab2a63cf1cfbd03df8e044 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-sbgvv" podUID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" containerName="registry-server" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.100881 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-vd8vf" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.526025 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.531736 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.590916 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-catalog-content\") pod \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\" (UID: \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\") " Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.590965 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8d7b\" (UniqueName: \"kubernetes.io/projected/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-kube-api-access-v8d7b\") pod \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\" (UID: \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\") " Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.590990 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs9zp\" (UniqueName: \"kubernetes.io/projected/2506466d-db79-4b0e-a2df-2d64c56ad7cd-kube-api-access-zs9zp\") pod \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\" (UID: \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\") " Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.591011 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-utilities\") pod \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\" (UID: \"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9\") " Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.591041 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2506466d-db79-4b0e-a2df-2d64c56ad7cd-catalog-content\") pod \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\" (UID: \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\") " Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.591070 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2506466d-db79-4b0e-a2df-2d64c56ad7cd-utilities\") pod \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\" (UID: \"2506466d-db79-4b0e-a2df-2d64c56ad7cd\") " Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.591947 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2506466d-db79-4b0e-a2df-2d64c56ad7cd-utilities" (OuterVolumeSpecName: "utilities") pod "2506466d-db79-4b0e-a2df-2d64c56ad7cd" (UID: "2506466d-db79-4b0e-a2df-2d64c56ad7cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.592021 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2506466d-db79-4b0e-a2df-2d64c56ad7cd-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.592078 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-utilities" (OuterVolumeSpecName: "utilities") pod "fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" (UID: "fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.600006 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-kube-api-access-v8d7b" (OuterVolumeSpecName: "kube-api-access-v8d7b") pod "fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" (UID: "fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9"). InnerVolumeSpecName "kube-api-access-v8d7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.600646 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2506466d-db79-4b0e-a2df-2d64c56ad7cd-kube-api-access-zs9zp" (OuterVolumeSpecName: "kube-api-access-zs9zp") pod "2506466d-db79-4b0e-a2df-2d64c56ad7cd" (UID: "2506466d-db79-4b0e-a2df-2d64c56ad7cd"). InnerVolumeSpecName "kube-api-access-zs9zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.667350 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2506466d-db79-4b0e-a2df-2d64c56ad7cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2506466d-db79-4b0e-a2df-2d64c56ad7cd" (UID: "2506466d-db79-4b0e-a2df-2d64c56ad7cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.693777 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.693821 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2506466d-db79-4b0e-a2df-2d64c56ad7cd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.693832 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8d7b\" (UniqueName: \"kubernetes.io/projected/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-kube-api-access-v8d7b\") on node \"crc\" DevicePath \"\"" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.693841 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs9zp\" (UniqueName: \"kubernetes.io/projected/2506466d-db79-4b0e-a2df-2d64c56ad7cd-kube-api-access-zs9zp\") on node \"crc\" DevicePath \"\"" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.700843 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4b95" event={"ID":"2506466d-db79-4b0e-a2df-2d64c56ad7cd","Type":"ContainerDied","Data":"302d6d8cd67bdd1998eac072ae450db6cf29604320014c119f91dc4f1636624b"} Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.700904 4937 scope.go:117] "RemoveContainer" containerID="311dae1aff7b2cf4299fc6e995712669ef0a537a831b933fc89a8ea39688f991" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.700868 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c4b95" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.702665 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbgvv" event={"ID":"fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9","Type":"ContainerDied","Data":"52fea6f79b46519ec210820a602341ea5e1eabe6a24e7ada5087bdcbefa4ad38"} Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.702751 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sbgvv" Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.737204 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c4b95"] Feb 25 15:51:46 crc kubenswrapper[4937]: I0225 15:51:46.740865 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c4b95"] Feb 25 15:51:47 crc kubenswrapper[4937]: I0225 15:51:47.238033 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" (UID: "fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:51:47 crc kubenswrapper[4937]: I0225 15:51:47.299642 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 15:51:47 crc kubenswrapper[4937]: I0225 15:51:47.329126 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sbgvv"] Feb 25 15:51:47 crc kubenswrapper[4937]: I0225 15:51:47.333269 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sbgvv"] Feb 25 15:51:47 crc kubenswrapper[4937]: I0225 15:51:47.377392 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" path="/var/lib/kubelet/pods/2506466d-db79-4b0e-a2df-2d64c56ad7cd/volumes" Feb 25 15:51:47 crc kubenswrapper[4937]: I0225 15:51:47.378193 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" path="/var/lib/kubelet/pods/fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9/volumes" Feb 25 15:51:54 crc kubenswrapper[4937]: E0225 15:51:54.017070 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 25 15:51:54 crc kubenswrapper[4937]: E0225 15:51:54.019328 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2jjdr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mrgxt_openshift-marketplace(3c753535-03f4-4888-8e28-43b4924726ae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 25 15:51:54 crc kubenswrapper[4937]: E0225 15:51:54.020553 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mrgxt" podUID="3c753535-03f4-4888-8e28-43b4924726ae" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.801728 4937 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.802667 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6e93a3-6673-464f-84a3-5585a6cbc0a8" containerName="oc" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.802686 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6e93a3-6673-464f-84a3-5585a6cbc0a8" containerName="oc" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.802701 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" containerName="registry-server" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.802709 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" containerName="registry-server" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.802722 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" containerName="extract-utilities" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.802731 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" containerName="extract-utilities" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.802743 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" containerName="extract-content" Feb 25 15:51:59 crc 
kubenswrapper[4937]: I0225 15:51:59.802752 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" containerName="extract-content" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.802760 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" containerName="extract-utilities" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.802768 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" containerName="extract-utilities" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.802777 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3441d1f7-f511-4231-8dc1-a522a8e572e8" containerName="pruner" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.802784 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="3441d1f7-f511-4231-8dc1-a522a8e572e8" containerName="pruner" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.802800 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" containerName="registry-server" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.802808 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" containerName="registry-server" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.802821 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c63f82a-9346-476b-ae17-edb260b2a36f" containerName="oc" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.802829 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c63f82a-9346-476b-ae17-edb260b2a36f" containerName="oc" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.802842 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" containerName="extract-content" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.802850 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" containerName="extract-content" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.802967 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa82e4ce-a6cd-4be3-8cca-0a355d6efbc9" containerName="registry-server" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.802991 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="3441d1f7-f511-4231-8dc1-a522a8e572e8" containerName="pruner" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.803002 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c63f82a-9346-476b-ae17-edb260b2a36f" containerName="oc" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.803014 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="2506466d-db79-4b0e-a2df-2d64c56ad7cd" containerName="registry-server" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.803023 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e6e93a3-6673-464f-84a3-5585a6cbc0a8" containerName="oc" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.803414 4937 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.803761 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
containerID="cri-o://c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea" gracePeriod=15 Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.803868 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.803902 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa" gracePeriod=15 Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.803885 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9" gracePeriod=15 Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.803940 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52" gracePeriod=15 Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.803930 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925" gracePeriod=15 Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805287 4937 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.805461 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805480 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.805510 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805518 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.805531 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805540 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.805547 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805555 4937 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.805564 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805572 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.805583 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805591 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.805602 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805609 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.805623 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805631 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.805640 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805647 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805779 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805855 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805867 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805875 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805901 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805913 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.805966 4937 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.806092 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.806104 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.806221 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.806235 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 15:51:59 crc kubenswrapper[4937]: E0225 15:51:59.862974 4937 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.130:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.973400 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.973453 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.973520 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.973559 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.973591 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.973610 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.973644 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:51:59 crc kubenswrapper[4937]: I0225 15:51:59.973686 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074285 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074352 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074389 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074411 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074442 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074505 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074539 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074562 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074563 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074643 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074695 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074651 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074743 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074750 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074784 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.074785 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.163638 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.618876 4937 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.618952 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.795522 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.797115 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.797801 4937 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52" exitCode=0 Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.797925 4937 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9" exitCode=0 Feb 25 15:52:00 crc kubenswrapper[4937]: I0225 15:52:00.798000 4937 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa" exitCode=2 Feb 25 15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.377817 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.379394 4937 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.381514 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 
15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.616907 4937 scope.go:117] "RemoveContainer" containerID="d2c599db28b3599d1a3907d6cf44f0728dd9f31fdd7e2f554c9e80cda476b94c" Feb 25 15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.806835 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.808247 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 25 15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.808995 4937 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925" exitCode=0 Feb 25 15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.810713 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d5b93a4226ebdac95ac553f51c1828b998152aa01ca3bcba0ffee99b5ab0e680"} Feb 25 15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.814594 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9b212b58d3bd672202efa4f0c81067991fe1b6b72cf9217c687c8484f068b782"} Feb 25 15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.816246 4937 generic.go:334] "Generic (PLEG): container finished" podID="e8624c64-7d59-4173-8928-e7dff50f1039" containerID="062bc97550bb9a8d6bb08e3a94bfe79a1a2302a233315a3830380be5d642fcd7" exitCode=0 Feb 25 15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.816289 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e8624c64-7d59-4173-8928-e7dff50f1039","Type":"ContainerDied","Data":"062bc97550bb9a8d6bb08e3a94bfe79a1a2302a233315a3830380be5d642fcd7"} Feb 25 15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.816852 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.817169 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:01 crc kubenswrapper[4937]: I0225 15:52:01.817683 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:02 crc kubenswrapper[4937]: I0225 15:52:02.827106 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 15:52:02 crc kubenswrapper[4937]: I0225 15:52:02.831088 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 25 15:52:02 crc kubenswrapper[4937]: I0225 15:52:02.831929 4937 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea" exitCode=0 Feb 25 15:52:06 crc kubenswrapper[4937]: E0225 15:52:06.114463 4937 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:06 crc kubenswrapper[4937]: E0225 15:52:06.115912 4937 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:06 crc kubenswrapper[4937]: E0225 15:52:06.116096 4937 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:06 crc kubenswrapper[4937]: E0225 15:52:06.116258 4937 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:06 crc kubenswrapper[4937]: E0225 15:52:06.116553 4937 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:06 crc kubenswrapper[4937]: I0225 15:52:06.116607 4937 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 25 15:52:06 crc kubenswrapper[4937]: E0225 15:52:06.117042 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="200ms" Feb 25 15:52:06 crc kubenswrapper[4937]: E0225 15:52:06.318250 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="400ms" Feb 25 15:52:06 crc kubenswrapper[4937]: E0225 15:52:06.719387 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="800ms" Feb 25 15:52:07 crc kubenswrapper[4937]: E0225 15:52:07.520864 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.130:6443: connect: connection refused" interval="1.6s" Feb 25 15:52:08 crc kubenswrapper[4937]: I0225 15:52:08.369255 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:08 crc kubenswrapper[4937]: I0225 15:52:08.370005 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:08 crc kubenswrapper[4937]: I0225 15:52:08.370469 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:08 crc kubenswrapper[4937]: I0225 15:52:08.370961 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:09 crc kubenswrapper[4937]: E0225 15:52:09.122061 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="3.2s" Feb 25 15:52:09 crc kubenswrapper[4937]: E0225 15:52:09.439534 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mrgxt" podUID="3c753535-03f4-4888-8e28-43b4924726ae" Feb 25 15:52:09 crc kubenswrapper[4937]: E0225 15:52:09.440201 4937 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events/redhat-marketplace-mrgxt.1897882683dfebfb\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-mrgxt.1897882683dfebfb openshift-marketplace 29454 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-mrgxt,UID:3c753535-03f4-4888-8e28-43b4924726ae,APIVersion:v1,ResourceVersion:28627,FieldPath:spec.initContainers{extract-content},},Reason:BackOff,Message:Back-off pulling image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\",Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:51:17 +0000 UTC,LastTimestamp:2026-02-25 15:52:09.439450122 +0000 UTC m=+380.452842012,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 
15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.496645 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.497206 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.497823 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.498888 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.499359 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.679886 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8624c64-7d59-4173-8928-e7dff50f1039-var-lock\") pod \"e8624c64-7d59-4173-8928-e7dff50f1039\" (UID: \"e8624c64-7d59-4173-8928-e7dff50f1039\") " Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.680169 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8624c64-7d59-4173-8928-e7dff50f1039-var-lock" (OuterVolumeSpecName: "var-lock") pod "e8624c64-7d59-4173-8928-e7dff50f1039" (UID: "e8624c64-7d59-4173-8928-e7dff50f1039"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.680553 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8624c64-7d59-4173-8928-e7dff50f1039-kubelet-dir\") pod \"e8624c64-7d59-4173-8928-e7dff50f1039\" (UID: \"e8624c64-7d59-4173-8928-e7dff50f1039\") " Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.680634 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8624c64-7d59-4173-8928-e7dff50f1039-kube-api-access\") pod \"e8624c64-7d59-4173-8928-e7dff50f1039\" (UID: \"e8624c64-7d59-4173-8928-e7dff50f1039\") " Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.680667 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8624c64-7d59-4173-8928-e7dff50f1039-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e8624c64-7d59-4173-8928-e7dff50f1039" (UID: "e8624c64-7d59-4173-8928-e7dff50f1039"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.680961 4937 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8624c64-7d59-4173-8928-e7dff50f1039-var-lock\") on node \"crc\" DevicePath \"\"" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.680984 4937 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8624c64-7d59-4173-8928-e7dff50f1039-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.687304 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8624c64-7d59-4173-8928-e7dff50f1039-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e8624c64-7d59-4173-8928-e7dff50f1039" (UID: "e8624c64-7d59-4173-8928-e7dff50f1039"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.782897 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8624c64-7d59-4173-8928-e7dff50f1039-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.876175 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e8624c64-7d59-4173-8928-e7dff50f1039","Type":"ContainerDied","Data":"e0a0898652bff388c81683ce0ec24e45cf61aa40e4e64faf47834486db0cbd1c"} Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.876222 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0a0898652bff388c81683ce0ec24e45cf61aa40e4e64faf47834486db0cbd1c" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.876264 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.904170 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.904927 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.905762 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:09 crc kubenswrapper[4937]: I0225 15:52:09.906301 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:10 crc kubenswrapper[4937]: E0225 15:52:10.376655 4937 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 25 15:52:10 crc kubenswrapper[4937]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(a3d8cc22738e978e003328225f534afa17aedf3c0d2b42952f4d1fd3498da3d6): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a3d8cc22738e978e003328225f534afa17aedf3c0d2b42952f4d1fd3498da3d6" Netns:"/var/run/netns/bfe35e69-5dea-4188-b939-2ed36706e510" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=a3d8cc22738e978e003328225f534afa17aedf3c0d2b42952f4d1fd3498da3d6;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:52:10 crc 
kubenswrapper[4937]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 25 15:52:10 crc kubenswrapper[4937]: > Feb 25 15:52:10 crc kubenswrapper[4937]: E0225 15:52:10.376909 4937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 25 15:52:10 crc kubenswrapper[4937]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(a3d8cc22738e978e003328225f534afa17aedf3c0d2b42952f4d1fd3498da3d6): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a3d8cc22738e978e003328225f534afa17aedf3c0d2b42952f4d1fd3498da3d6" Netns:"/var/run/netns/bfe35e69-5dea-4188-b939-2ed36706e510" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=a3d8cc22738e978e003328225f534afa17aedf3c0d2b42952f4d1fd3498da3d6;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:52:10 crc kubenswrapper[4937]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 25 15:52:10 crc kubenswrapper[4937]: > pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:52:10 crc kubenswrapper[4937]: E0225 15:52:10.377061 4937 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 25 15:52:10 crc kubenswrapper[4937]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(a3d8cc22738e978e003328225f534afa17aedf3c0d2b42952f4d1fd3498da3d6): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"a3d8cc22738e978e003328225f534afa17aedf3c0d2b42952f4d1fd3498da3d6" Netns:"/var/run/netns/bfe35e69-5dea-4188-b939-2ed36706e510" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=a3d8cc22738e978e003328225f534afa17aedf3c0d2b42952f4d1fd3498da3d6;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:52:10 crc kubenswrapper[4937]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 25 15:52:10 crc kubenswrapper[4937]: > pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:52:10 crc kubenswrapper[4937]: E0225 15:52:10.377330 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"networking-console-plugin-85b44fc459-gdk6g_openshift-network-console(5fe485a1-e14f-4c09-b5b9-f252bc42b7e8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"networking-console-plugin-85b44fc459-gdk6g_openshift-network-console(5fe485a1-e14f-4c09-b5b9-f252bc42b7e8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(a3d8cc22738e978e003328225f534afa17aedf3c0d2b42952f4d1fd3498da3d6): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"a3d8cc22738e978e003328225f534afa17aedf3c0d2b42952f4d1fd3498da3d6\\\" Netns:\\\"/var/run/netns/bfe35e69-5dea-4188-b939-2ed36706e510\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=a3d8cc22738e978e003328225f534afa17aedf3c0d2b42952f4d1fd3498da3d6;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod 
networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s\\\": dial tcp 38.102.83.130:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:52:11 crc kubenswrapper[4937]: I0225 15:52:11.372197 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:11 crc kubenswrapper[4937]: I0225 15:52:11.372896 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:11 crc kubenswrapper[4937]: I0225 15:52:11.373471 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:11 crc kubenswrapper[4937]: I0225 15:52:11.373919 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:12 crc kubenswrapper[4937]: E0225 15:52:12.322972 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="6.4s" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.274337 4937 scope.go:117] "RemoveContainer" containerID="254d37e2a8361ca3a196ad356445a1d3c07eb809914a428bc5b0591de538480b" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.349581 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.352820 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 25 
15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.354701 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.355705 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.356269 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.357311 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.358012 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.358537 4937 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.437848 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.437964 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.438001 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.438083 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.438160 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.438265 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.438450 4937 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.438476 4937 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.438506 4937 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.905323 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.907224 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.908373 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.909225 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.909574 4937 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.910035 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.911153 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.911700 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.938371 4937 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.938847 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.939323 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.939663 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:13 crc kubenswrapper[4937]: I0225 15:52:13.940089 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:14 crc kubenswrapper[4937]: I0225 15:52:14.804450 4937 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 25 15:52:14 crc kubenswrapper[4937]: I0225 15:52:14.804586 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 25 15:52:14 crc kubenswrapper[4937]: I0225 15:52:14.920701 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 25 15:52:14 crc kubenswrapper[4937]: I0225 15:52:14.921630 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 25 15:52:14 crc kubenswrapper[4937]: I0225 15:52:14.921708 4937 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018" exitCode=1 Feb 25 15:52:14 crc kubenswrapper[4937]: I0225 15:52:14.921758 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018"} Feb 25 15:52:14 crc kubenswrapper[4937]: I0225 15:52:14.922636 4937 scope.go:117] "RemoveContainer" containerID="edbea74eed0f8767f6ec630d046576fc904c9a719a93115ee029680866161018" Feb 25 15:52:14 crc kubenswrapper[4937]: I0225 15:52:14.923124 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:14 crc kubenswrapper[4937]: I0225 15:52:14.923904 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:14 crc kubenswrapper[4937]: I0225 15:52:14.924638 4937 status_manager.go:851] "Failed to get status for pod" 
podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:14 crc kubenswrapper[4937]: I0225 15:52:14.925024 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:14 crc kubenswrapper[4937]: I0225 15:52:14.925662 4937 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:14 crc kubenswrapper[4937]: I0225 15:52:14.926195 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:15 crc kubenswrapper[4937]: I0225 15:52:15.378749 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 25 15:52:15 crc kubenswrapper[4937]: I0225 15:52:15.743550 4937 scope.go:117] "RemoveContainer" containerID="6fcb26a16d5d62bcc02d7506965dab9a5b707774bcab2a63cf1cfbd03df8e044" Feb 25 15:52:15 crc kubenswrapper[4937]: I0225 15:52:15.874631 4937 scope.go:117] "RemoveContainer" containerID="b0a3761169e64f2100446d82ebf67a7d7bdb0d3e34358469dbbe32515344cb77" Feb 25 15:52:15 crc kubenswrapper[4937]: I0225 15:52:15.936831 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"295c60618ac8d3aab2e724393a83fcee2406df13057680c2aa5339f1a1205c15"} Feb 25 15:52:15 crc kubenswrapper[4937]: I0225 15:52:15.965539 4937 scope.go:117] "RemoveContainer" containerID="b462f62dccd597bb271b38ef29f98d6f95a14725022e0cdf0eaa0dbf13e72c77" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.027806 4937 scope.go:117] "RemoveContainer" containerID="74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.061092 4937 scope.go:117] "RemoveContainer" containerID="83c96e263a1df1a6d5c0c699483c7d6290bd2b471ce7d99674832f9b78d83a52" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.084726 4937 scope.go:117] "RemoveContainer" containerID="74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527" Feb 25 15:52:16 crc kubenswrapper[4937]: E0225 15:52:16.085400 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\": container with ID starting with 74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527 not found: 
ID does not exist" containerID="74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.085450 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527"} err="failed to get container status \"74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\": rpc error: code = NotFound desc = could not find container \"74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527\": container with ID starting with 74f7723805d627acbf88464fd465bfd37c677a61dc467337358e83fe26e8b527 not found: ID does not exist" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.085480 4937 scope.go:117] "RemoveContainer" containerID="00794fa7295a712ec126691108250fb45c6f2e649d7bf7002d591b2e1fc492d9" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.112282 4937 scope.go:117] "RemoveContainer" containerID="38477251fb95cbab48e740dc87a814cb51a1680f24dd6a73a8a844787a8ed925" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.131421 4937 scope.go:117] "RemoveContainer" containerID="8646599e9ff593c5a8b766455ad2499b97995d9dd3e0b78e54102edc09b259aa" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.143799 4937 scope.go:117] "RemoveContainer" containerID="c6e9a7961fed7ced7c0d98534f83cedd7d6bf503f4380475a2146b8960402bea" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.155988 4937 scope.go:117] "RemoveContainer" containerID="20fe2a3207492308d4ec0ae716328044ff3e0d02171961e548e7c385e4ee408c" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.946671 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54sqd" event={"ID":"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c","Type":"ContainerStarted","Data":"9f1a8301500e621cbb777d1bdbf3e0d51d4638711e48923cf27422aa55f63267"} Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.949303 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9tlm" event={"ID":"fae7d336-b701-4174-bdae-bd3f1bc032b1","Type":"ContainerStarted","Data":"088f18cd4ff4698383b7079b3ab76ce7b26f4672d20851c04010cccfc4589231"} Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.953714 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsxxs" event={"ID":"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b","Type":"ContainerStarted","Data":"ce1478d91047d9edfb30553059efd253e36a6dd4b8e10817ae72246d156f6666"} Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.961326 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scndr" event={"ID":"dc970acf-3cdb-4951-8f35-705ce003550f","Type":"ContainerStarted","Data":"af08644e562333e00be97dae4e3a62ec89446ec0ceeda29e11bec5daa22f4d10"} Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.964318 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2f59940a441b8bf7c213dadb16cfdacb7fdd87ff4b235b1a6339a8b538eed79e"} Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.965750 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cad7e6a23e590abd58d2097fa0e176b40c16ac869fa181dd265b350f6856a811"} Feb 
25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.966634 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.966938 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.967203 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.967458 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.967727 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.969868 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.970541 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.970713 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0c5350ff6076f8ef742144e818bb42f65bbf08247696322b60261b220669d0b4"} Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.974870 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"be36bbb4cb7caae3cffeabae0872a130cb578c2c14f08050823ce37f503bd174"} Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.975093 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.975828 4937 status_manager.go:851] "Failed to get status for pod" 
podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.976567 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.977176 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8xkp" event={"ID":"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15","Type":"ContainerStarted","Data":"987763612bb0738cda4ef2c12210a475f4d3e910f1a99c8e8231554980baa657"} Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.977252 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.977679 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:16 crc kubenswrapper[4937]: I0225 15:52:16.978017 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:17 crc kubenswrapper[4937]: E0225 15:52:17.192947 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:17Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:17Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:17Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:17Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a06f213440be91e4fe4375bf013482430fdc12ea74ae3dbd86ab47ac200c9f64\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:b71b2b8c5aa2fddc051b7076112140a82be26b0dadfb4be056616ce64fde2806\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252065488},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0ebd2888f76603b9f1c745febc543c073cdd98b7ad625130eafc304e107db3e1\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:5ca0241f2bf3e934a8a8654751fda7faccea72ab876cd99249edba8a27363486\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1215337671},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295
e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:17 crc kubenswrapper[4937]: E0225 
15:52:17.193442 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:17 crc kubenswrapper[4937]: E0225 15:52:17.193710 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:17 crc kubenswrapper[4937]: E0225 15:52:17.194112 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:17 crc kubenswrapper[4937]: E0225 15:52:17.194656 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:17 crc kubenswrapper[4937]: E0225 15:52:17.194677 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.984405 4937 generic.go:334] "Generic (PLEG): container finished" podID="fae7d336-b701-4174-bdae-bd3f1bc032b1" containerID="088f18cd4ff4698383b7079b3ab76ce7b26f4672d20851c04010cccfc4589231" exitCode=0 Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.984521 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9tlm" event={"ID":"fae7d336-b701-4174-bdae-bd3f1bc032b1","Type":"ContainerDied","Data":"088f18cd4ff4698383b7079b3ab76ce7b26f4672d20851c04010cccfc4589231"} Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.986620 4937 generic.go:334] "Generic (PLEG): container finished" podID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" containerID="ce1478d91047d9edfb30553059efd253e36a6dd4b8e10817ae72246d156f6666" exitCode=0 Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.986692 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsxxs" event={"ID":"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b","Type":"ContainerDied","Data":"ce1478d91047d9edfb30553059efd253e36a6dd4b8e10817ae72246d156f6666"} Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.988437 4937 generic.go:334] "Generic (PLEG): container finished" podID="dc970acf-3cdb-4951-8f35-705ce003550f" containerID="af08644e562333e00be97dae4e3a62ec89446ec0ceeda29e11bec5daa22f4d10" exitCode=0 Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.988528 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scndr" event={"ID":"dc970acf-3cdb-4951-8f35-705ce003550f","Type":"ContainerDied","Data":"af08644e562333e00be97dae4e3a62ec89446ec0ceeda29e11bec5daa22f4d10"} Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.990537 4937 generic.go:334] "Generic (PLEG): container finished" podID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" containerID="987763612bb0738cda4ef2c12210a475f4d3e910f1a99c8e8231554980baa657" exitCode=0 Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.990625 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8xkp" 
event={"ID":"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15","Type":"ContainerDied","Data":"987763612bb0738cda4ef2c12210a475f4d3e910f1a99c8e8231554980baa657"} Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.992369 4937 generic.go:334] "Generic (PLEG): container finished" podID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" containerID="9f1a8301500e621cbb777d1bdbf3e0d51d4638711e48923cf27422aa55f63267" exitCode=0 Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.993609 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54sqd" event={"ID":"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c","Type":"ContainerDied","Data":"9f1a8301500e621cbb777d1bdbf3e0d51d4638711e48923cf27422aa55f63267"} Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.994580 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.995236 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.995762 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.996302 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.996764 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:17 crc kubenswrapper[4937]: I0225 15:52:17.997338 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:18 crc kubenswrapper[4937]: I0225 15:52:18.559140 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:52:18 crc kubenswrapper[4937]: E0225 15:52:18.724736 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="7s" Feb 25 15:52:18 crc kubenswrapper[4937]: E0225 15:52:18.805257 4937 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events/redhat-marketplace-mrgxt.1897882683dfebfb\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-mrgxt.1897882683dfebfb openshift-marketplace 29454 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-mrgxt,UID:3c753535-03f4-4888-8e28-43b4924726ae,APIVersion:v1,ResourceVersion:28627,FieldPath:spec.initContainers{extract-content},},Reason:BackOff,Message:Back-off pulling image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\",Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:51:17 +0000 UTC,LastTimestamp:2026-02-25 15:52:09.439450122 +0000 UTC m=+380.452842012,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.031407 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.031770 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.032048 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.032296 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.032588 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.032867 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.033416 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: E0225 15:52:19.033460 4937 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.130:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.033805 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.034141 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.034460 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.035949 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.037012 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.038451 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.038866 4937 status_manager.go:851] "Failed to 
get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.039407 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.040148 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:19 crc kubenswrapper[4937]: I0225 15:52:19.040592 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.380242 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.380978 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.381362 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.381651 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.381858 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 
15:52:21.382053 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.382274 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.382541 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.382776 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.383087 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.383427 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.383772 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.384130 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.384607 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 
38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.384972 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.385218 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.385451 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.385816 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.386169 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:21 crc kubenswrapper[4937]: I0225 15:52:21.386445 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.052594 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.052909 4937 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="cad7e6a23e590abd58d2097fa0e176b40c16ac869fa181dd265b350f6856a811" exitCode=255 Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.052940 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"cad7e6a23e590abd58d2097fa0e176b40c16ac869fa181dd265b350f6856a811"} Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.053372 4937 scope.go:117] "RemoveContainer" containerID="cad7e6a23e590abd58d2097fa0e176b40c16ac869fa181dd265b350f6856a811" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.053778 4937 status_manager.go:851] 
"Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.054130 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.054303 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.054537 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.054900 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.055250 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.055568 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.055752 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.056015 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection 
refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.056348 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.888802 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.895255 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.895658 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.895934 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.896100 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.896240 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.896377 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.896536 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.896761 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.897040 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.897277 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:22 crc kubenswrapper[4937]: I0225 15:52:22.897547 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:23 crc kubenswrapper[4937]: I0225 15:52:23.058074 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.074023 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.074451 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"76ff5da83007eb57f17a354fea6a2b0441a4b7a037f0156b3f11e71ae5503482"} Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.366976 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.367071 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.367611 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.368072 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.368535 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.368943 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.369347 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.369809 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.370198 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.370662 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.371278 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.371917 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" 
pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.372369 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.384455 4937 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.384578 4937 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:52:25 crc kubenswrapper[4937]: E0225 15:52:25.385175 4937 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:25 crc kubenswrapper[4937]: I0225 15:52:25.385816 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:25 crc kubenswrapper[4937]: E0225 15:52:25.726589 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="7s" Feb 25 15:52:25 crc kubenswrapper[4937]: E0225 15:52:25.953173 4937 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-76ff5da83007eb57f17a354fea6a2b0441a4b7a037f0156b3f11e71ae5503482.scope\": RecentStats: unable to find data in memory cache]" Feb 25 15:52:26 crc kubenswrapper[4937]: I0225 15:52:26.083022 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:26 crc kubenswrapper[4937]: I0225 15:52:26.084414 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:26 crc kubenswrapper[4937]: I0225 15:52:26.085062 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:26 crc kubenswrapper[4937]: 
I0225 15:52:26.085607 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:26 crc kubenswrapper[4937]: I0225 15:52:26.086211 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:26 crc kubenswrapper[4937]: I0225 15:52:26.086739 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:26 crc kubenswrapper[4937]: I0225 15:52:26.087184 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:26 crc kubenswrapper[4937]: I0225 15:52:26.087614 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:26 crc kubenswrapper[4937]: I0225 15:52:26.088098 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:26 crc kubenswrapper[4937]: I0225 15:52:26.088691 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:27 crc kubenswrapper[4937]: E0225 15:52:27.211139 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:27Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:27Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:27Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:27Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a06f213440be91e4fe4375bf013482430fdc12ea74ae3dbd86ab47ac200c9f64\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:b71b2b8c5aa2fddc051b7076112140a82be26b0dadfb4be056616ce64fde2806\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252065488},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0ebd2888f76603b9f1c745febc543c073cdd98b7ad625130eafc304e107db3e1\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:5ca0241f2bf3e934a8a8654751fda7faccea72ab876cd99249edba8a27363486\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1215337671},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295
e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces
\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:27 crc kubenswrapper[4937]: E0225 15:52:27.212553 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:27 crc kubenswrapper[4937]: E0225 15:52:27.213129 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:27 crc kubenswrapper[4937]: E0225 15:52:27.213552 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:27 crc kubenswrapper[4937]: E0225 15:52:27.214088 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:27 crc kubenswrapper[4937]: E0225 15:52:27.214125 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.100800 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.101751 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.101815 4937 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="76ff5da83007eb57f17a354fea6a2b0441a4b7a037f0156b3f11e71ae5503482" exitCode=255 Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.101866 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"76ff5da83007eb57f17a354fea6a2b0441a4b7a037f0156b3f11e71ae5503482"} Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.101915 4937 scope.go:117] "RemoveContainer" containerID="cad7e6a23e590abd58d2097fa0e176b40c16ac869fa181dd265b350f6856a811" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.102567 4937 scope.go:117] "RemoveContainer" containerID="76ff5da83007eb57f17a354fea6a2b0441a4b7a037f0156b3f11e71ae5503482" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.102684 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc 
kubenswrapper[4937]: E0225 15:52:28.102883 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.103137 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.103755 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.104086 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.104624 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.105188 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.105663 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.105999 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.106466 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.107040 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.563748 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.564847 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.565460 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.566083 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.566603 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.567113 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.567586 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.568090 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 
38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.568558 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.568970 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: I0225 15:52:28.569379 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:28 crc kubenswrapper[4937]: E0225 15:52:28.806470 4937 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events/redhat-marketplace-mrgxt.1897882683dfebfb\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-mrgxt.1897882683dfebfb openshift-marketplace 29454 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-mrgxt,UID:3c753535-03f4-4888-8e28-43b4924726ae,APIVersion:v1,ResourceVersion:28627,FieldPath:spec.initContainers{extract-content},},Reason:BackOff,Message:Back-off pulling image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\",Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:51:17 +0000 UTC,LastTimestamp:2026-02-25 15:52:09.439450122 +0000 UTC m=+380.452842012,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:52:31 crc kubenswrapper[4937]: I0225 15:52:31.376577 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:31 crc kubenswrapper[4937]: I0225 15:52:31.377419 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:31 crc kubenswrapper[4937]: I0225 15:52:31.377906 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 
38.102.83.130:6443: connect: connection refused" Feb 25 15:52:31 crc kubenswrapper[4937]: I0225 15:52:31.378286 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:31 crc kubenswrapper[4937]: I0225 15:52:31.378684 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:31 crc kubenswrapper[4937]: I0225 15:52:31.379168 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:31 crc kubenswrapper[4937]: I0225 15:52:31.379700 4937 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:31 crc kubenswrapper[4937]: I0225 15:52:31.380052 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:31 crc kubenswrapper[4937]: I0225 15:52:31.380376 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:31 crc kubenswrapper[4937]: I0225 15:52:31.380715 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:31 crc kubenswrapper[4937]: I0225 15:52:31.381022 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:32 crc kubenswrapper[4937]: E0225 15:52:32.729233 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: 
connection refused" interval="7s" Feb 25 15:52:33 crc kubenswrapper[4937]: I0225 15:52:33.135467 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Feb 25 15:52:37 crc kubenswrapper[4937]: E0225 15:52:37.484201 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:37Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:37Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:37Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:37Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a06f213440be91e4fe4375bf013482430fdc12ea74ae3dbd86ab47ac200c9f64\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:b71b2b8c5aa2fddc051b7076112140a82be26b0dadfb4be056616ce64fde2806\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252065488},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0ebd2888f76603b9f1c745febc543c073cdd98b7ad625130eafc304e107db3e1\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:5ca0241f2bf3e934a8a8654751fda7faccea72ab876cd99249edba8a27363486\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1215337671},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\
\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a
8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84
d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:37 crc kubenswrapper[4937]: E0225 15:52:37.485261 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:37 crc kubenswrapper[4937]: E0225 15:52:37.485625 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:37 crc kubenswrapper[4937]: E0225 15:52:37.486098 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:37 crc kubenswrapper[4937]: E0225 15:52:37.486670 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:37 crc kubenswrapper[4937]: E0225 15:52:37.486699 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:52:38 crc kubenswrapper[4937]: E0225 15:52:38.808109 4937 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events/redhat-marketplace-mrgxt.1897882683dfebfb\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-mrgxt.1897882683dfebfb openshift-marketplace 29454 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-mrgxt,UID:3c753535-03f4-4888-8e28-43b4924726ae,APIVersion:v1,ResourceVersion:28627,FieldPath:spec.initContainers{extract-content},},Reason:BackOff,Message:Back-off pulling image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\",Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 15:51:17 +0000 UTC,LastTimestamp:2026-02-25 15:52:09.439450122 +0000 UTC m=+380.452842012,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 15:52:39 crc kubenswrapper[4937]: E0225 15:52:39.730910 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="7s" Feb 25 15:52:40 crc kubenswrapper[4937]: I0225 15:52:40.368006 4937 scope.go:117] "RemoveContainer" containerID="76ff5da83007eb57f17a354fea6a2b0441a4b7a037f0156b3f11e71ae5503482" Feb 25 15:52:40 crc kubenswrapper[4937]: I0225 15:52:40.368175 4937 status_manager.go:851] "Failed to get status for pod" 
podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:40 crc kubenswrapper[4937]: I0225 15:52:40.369189 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:40 crc kubenswrapper[4937]: I0225 15:52:40.370294 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:40 crc kubenswrapper[4937]: I0225 15:52:40.370810 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:40 crc kubenswrapper[4937]: I0225 15:52:40.371455 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:40 crc kubenswrapper[4937]: I0225 15:52:40.372033 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:40 crc kubenswrapper[4937]: I0225 15:52:40.372459 4937 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:40 crc kubenswrapper[4937]: I0225 15:52:40.373009 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:40 crc kubenswrapper[4937]: I0225 15:52:40.373652 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:40 crc kubenswrapper[4937]: I0225 15:52:40.374238 4937 
status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:40 crc kubenswrapper[4937]: I0225 15:52:40.374795 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:41 crc kubenswrapper[4937]: I0225 15:52:41.369688 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:41 crc kubenswrapper[4937]: I0225 15:52:41.369893 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:41 crc kubenswrapper[4937]: I0225 15:52:41.370037 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:41 crc kubenswrapper[4937]: I0225 15:52:41.370463 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:41 crc kubenswrapper[4937]: I0225 15:52:41.371682 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:41 crc kubenswrapper[4937]: I0225 15:52:41.372117 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:41 crc kubenswrapper[4937]: I0225 15:52:41.372436 4937 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:41 crc kubenswrapper[4937]: 
I0225 15:52:41.372830 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:41 crc kubenswrapper[4937]: I0225 15:52:41.373087 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:41 crc kubenswrapper[4937]: I0225 15:52:41.373388 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:41 crc kubenswrapper[4937]: I0225 15:52:41.373964 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.200469 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-vrzqb_ef543e1b-8068-4ea3-b32a-61027b32e95d/approver/0.log" Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.201150 4937 generic.go:334] "Generic (PLEG): container finished" podID="ef543e1b-8068-4ea3-b32a-61027b32e95d" containerID="cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15" exitCode=1 Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.201209 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerDied","Data":"cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15"} Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.201837 4937 scope.go:117] "RemoveContainer" containerID="cbfc163d9968791da089eb7bdaeed956776c9589cf4e1996d7fe60f701ac6f15" Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.202564 4937 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.203137 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.203711 4937 status_manager.go:851] "Failed to get 
status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.204114 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.204590 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.204906 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.205298 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.205739 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.206136 4937 status_manager.go:851] "Failed to get status for pod" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-node-identity/pods/network-node-identity-vrzqb\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.206562 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:42 crc kubenswrapper[4937]: I0225 15:52:42.206936 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:42 crc 
kubenswrapper[4937]: I0225 15:52:42.207326 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:44 crc kubenswrapper[4937]: W0225 15:52:44.252935 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-070144e81a44e920612b118ddd983552dfad29ab1142eff2a881e825928cb6a5 WatchSource:0}: Error finding container 070144e81a44e920612b118ddd983552dfad29ab1142eff2a881e825928cb6a5: Status 404 returned error can't find the container with id 070144e81a44e920612b118ddd983552dfad29ab1142eff2a881e825928cb6a5 Feb 25 15:52:44 crc kubenswrapper[4937]: E0225 15:52:44.911905 4937 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 25 15:52:44 crc kubenswrapper[4937]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(575552c14052535d1a6b50e212b794209c135e57ad2c28ac23da38bb5dd5dd13): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"575552c14052535d1a6b50e212b794209c135e57ad2c28ac23da38bb5dd5dd13" Netns:"/var/run/netns/95082dca-bfb4-4359-804e-db2d42045f2b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=575552c14052535d1a6b50e212b794209c135e57ad2c28ac23da38bb5dd5dd13;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:52:44 crc kubenswrapper[4937]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 25 15:52:44 crc kubenswrapper[4937]: > Feb 25 15:52:44 crc kubenswrapper[4937]: E0225 15:52:44.912424 4937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 25 15:52:44 crc kubenswrapper[4937]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(575552c14052535d1a6b50e212b794209c135e57ad2c28ac23da38bb5dd5dd13): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"575552c14052535d1a6b50e212b794209c135e57ad2c28ac23da38bb5dd5dd13" Netns:"/var/run/netns/95082dca-bfb4-4359-804e-db2d42045f2b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=575552c14052535d1a6b50e212b794209c135e57ad2c28ac23da38bb5dd5dd13;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:52:44 crc kubenswrapper[4937]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 25 15:52:44 crc kubenswrapper[4937]: > pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:52:44 crc kubenswrapper[4937]: E0225 15:52:44.912453 4937 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 25 15:52:44 crc kubenswrapper[4937]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(575552c14052535d1a6b50e212b794209c135e57ad2c28ac23da38bb5dd5dd13): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"575552c14052535d1a6b50e212b794209c135e57ad2c28ac23da38bb5dd5dd13" Netns:"/var/run/netns/95082dca-bfb4-4359-804e-db2d42045f2b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=575552c14052535d1a6b50e212b794209c135e57ad2c28ac23da38bb5dd5dd13;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod 
networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s": dial tcp 38.102.83.130:6443: connect: connection refused Feb 25 15:52:44 crc kubenswrapper[4937]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 25 15:52:44 crc kubenswrapper[4937]: > pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:52:44 crc kubenswrapper[4937]: E0225 15:52:44.912540 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"networking-console-plugin-85b44fc459-gdk6g_openshift-network-console(5fe485a1-e14f-4c09-b5b9-f252bc42b7e8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"networking-console-plugin-85b44fc459-gdk6g_openshift-network-console(5fe485a1-e14f-4c09-b5b9-f252bc42b7e8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(575552c14052535d1a6b50e212b794209c135e57ad2c28ac23da38bb5dd5dd13): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"575552c14052535d1a6b50e212b794209c135e57ad2c28ac23da38bb5dd5dd13\\\" Netns:\\\"/var/run/netns/95082dca-bfb4-4359-804e-db2d42045f2b\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=575552c14052535d1a6b50e212b794209c135e57ad2c28ac23da38bb5dd5dd13;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s\\\": dial tcp 38.102.83.130:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 15:52:45 crc kubenswrapper[4937]: I0225 15:52:45.225761 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"070144e81a44e920612b118ddd983552dfad29ab1142eff2a881e825928cb6a5"} Feb 25 15:52:46 crc kubenswrapper[4937]: E0225 15:52:46.732472 4937 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="7s" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.239993 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsxxs" event={"ID":"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b","Type":"ContainerStarted","Data":"ee3fb963f39141446a76e27da17b6222c78c082df995f1bf53363abfd73aebb0"} Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.241098 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.241412 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.241657 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.242021 4937 status_manager.go:851] "Failed to get status for pod" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-node-identity/pods/network-node-identity-vrzqb\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.242256 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 
38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.242512 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.242793 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.242794 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54sqd" event={"ID":"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c","Type":"ContainerStarted","Data":"388d7566d873d97ed691ebcb5de6c17076905eda19ee0aee5fc9fdb5f629be47"} Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.243031 4937 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.243266 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.243530 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.243754 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.243967 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.244222 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": 
dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.244403 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.244554 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.244634 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.244644 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5dffc17bc6cc37e12016244f182206403c216c983397f2eecd4bf31a4b5c40d6"} Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.244863 4937 status_manager.go:851] "Failed to get status for pod" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-node-identity/pods/network-node-identity-vrzqb\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.245088 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.245386 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.245934 4937 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.246213 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.246448 4937 
status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.246677 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.246767 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-vrzqb_ef543e1b-8068-4ea3-b32a-61027b32e95d/approver/0.log" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.246916 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.247054 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5eb504726e451a38f302440c222b8eebd6f4572431cc4388efb2fbf83a3723fd"} Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.247126 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.247361 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.247561 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.248053 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.248313 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.248665 4937 status_manager.go:851] "Failed to get status for pod" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-node-identity/pods/network-node-identity-vrzqb\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.248989 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.249240 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.249535 4937 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.249857 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.250170 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.250416 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9tlm" event={"ID":"fae7d336-b701-4174-bdae-bd3f1bc032b1","Type":"ContainerStarted","Data":"fc6791fd338d3867814b729210187a036b40162992c11d395455abbba0bd0686"} Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.250589 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.250965 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.251229 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.251547 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.251751 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.251952 4937 status_manager.go:851] "Failed to get status for pod" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-node-identity/pods/network-node-identity-vrzqb\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.252202 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.252529 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.252674 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scndr" event={"ID":"dc970acf-3cdb-4951-8f35-705ce003550f","Type":"ContainerStarted","Data":"c17984d83e8adf1748985ec49508a7289a7aee5dcf65ab112c2b756d0df53d3a"} Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.252793 4937 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.253075 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.253340 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.253561 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.253784 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.254024 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.254400 4937 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.254638 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.256639 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.256850 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.256872 4937 generic.go:334] "Generic (PLEG): container 
finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="f131de630def57e28f8dda872b75d282e335d7f67e52c229822d9ac6db16873d" exitCode=0 Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.256932 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"f131de630def57e28f8dda872b75d282e335d7f67e52c229822d9ac6db16873d"} Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.257186 4937 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.257219 4937 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.257232 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.257452 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: E0225 15:52:47.257656 4937 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.257667 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.258087 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.258659 4937 status_manager.go:851] "Failed to get status for pod" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-node-identity/pods/network-node-identity-vrzqb\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.258865 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.259030 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.259333 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8xkp" event={"ID":"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15","Type":"ContainerStarted","Data":"35a8ce8d7df5f75b07fc4c92a504fb00daf584ba25f36d27f4257af87a40d6e9"} Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.260210 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.260931 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.261275 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.261637 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.261995 4937 status_manager.go:851] "Failed to get status for pod" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-node-identity/pods/network-node-identity-vrzqb\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.262190 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.262605 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.262647 4937 generic.go:334] "Generic (PLEG): container finished" podID="3c753535-03f4-4888-8e28-43b4924726ae" containerID="acc0809d2132a187122f0db7ee87b36ce597ec52eb14bf66aa6cc98b7afd7dbc" exitCode=0 Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.262675 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrgxt" event={"ID":"3c753535-03f4-4888-8e28-43b4924726ae","Type":"ContainerDied","Data":"acc0809d2132a187122f0db7ee87b36ce597ec52eb14bf66aa6cc98b7afd7dbc"} Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.262906 4937 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.263315 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.264559 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.264826 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.265420 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.265639 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.265911 4937 status_manager.go:851] "Failed to get status for pod" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" pod="openshift-marketplace/redhat-operators-scndr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-scndr\": dial tcp 38.102.83.130:6443: connect: connection 
refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.266175 4937 status_manager.go:851] "Failed to get status for pod" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-node-identity/pods/network-node-identity-vrzqb\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.266373 4937 status_manager.go:851] "Failed to get status for pod" podUID="3c753535-03f4-4888-8e28-43b4924726ae" pod="openshift-marketplace/redhat-marketplace-mrgxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mrgxt\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.266544 4937 status_manager.go:851] "Failed to get status for pod" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.266694 4937 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.266829 4937 status_manager.go:851] "Failed to get status for pod" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.266963 4937 status_manager.go:851] "Failed to get status for pod" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" pod="openshift-marketplace/community-operators-l8xkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-l8xkp\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.267102 4937 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.267366 4937 status_manager.go:851] "Failed to get status for pod" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" pod="openshift-marketplace/redhat-marketplace-gsxxs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-gsxxs\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.267712 4937 status_manager.go:851] "Failed to get status for pod" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" pod="openshift-marketplace/redhat-operators-l9tlm" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l9tlm\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.268014 4937 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.272243 4937 status_manager.go:851] "Failed to get status for pod" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" pod="openshift-marketplace/certified-operators-54sqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-54sqd\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.304017 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:52:47 crc kubenswrapper[4937]: I0225 15:52:47.304080 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:52:47 crc kubenswrapper[4937]: E0225 15:52:47.527207 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:47Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:47Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:47Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T15:52:47Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:03f69b52171163369470deb6ce032e0bd126464f8cc8022f5e1303d5274fdd0c\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:feff3a398777e802c46e4e418bb7fc075351f1d4d3e8f3c832c8c264049990e3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1705397734},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a06f213440be91e4fe4375bf013482430fdc12ea74ae3dbd86ab47ac200c9f64\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:b71b2b8c5aa2fddc051b7076112140a82be26b0dadfb4be056616ce64fde2806\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252065488},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"name
s\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:4ee88c4e8eb75250b790b71866ff46a1475017dab792ea8fdc00df7a47d000f6\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b059262657a9e400dc8ada3350ae31ac6521640d2732574713de898875597a1a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1216721618},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0ebd2888f76603b9f1c745febc543c073cdd98b7ad625130eafc304e107db3e1\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:5ca0241f2bf3e934a8a8654751fda7faccea72ab876cd99249edba8a27363486\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1215337671},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61
f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: E0225 15:52:47.527567 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: E0225 15:52:47.527965 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: E0225 15:52:47.528168 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: E0225 15:52:47.528374 4937 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Feb 25 15:52:47 crc kubenswrapper[4937]: E0225 15:52:47.528422 4937 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 15:52:48 crc kubenswrapper[4937]: I0225 15:52:48.271381 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4f2bd1a5d74988702fe517e7c303b4777f290b5ba02703e238855aff7c9917c6"} Feb 25 15:52:48 crc kubenswrapper[4937]: I0225 15:52:48.271756 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2cd27cdda96c9a4cc429a22b156360888725556f20f8464716934a57315d5a42"} Feb 25 15:52:48 crc kubenswrapper[4937]: I0225 15:52:48.271773 4937 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"49f04b467d38246ce81ac57271b35030cab0b6a327c0087b8c86b1eed7291fc4"} Feb 25 15:52:48 crc kubenswrapper[4937]: I0225 15:52:48.291553 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:52:48 crc kubenswrapper[4937]: I0225 15:52:48.291622 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:52:48 crc kubenswrapper[4937]: I0225 15:52:48.339545 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 15:52:48 crc kubenswrapper[4937]: I0225 15:52:48.344473 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-gsxxs" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" containerName="registry-server" probeResult="failure" output=< Feb 25 15:52:48 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 15:52:48 crc kubenswrapper[4937]: > Feb 25 15:52:48 crc kubenswrapper[4937]: I0225 15:52:48.732640 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:52:48 crc kubenswrapper[4937]: I0225 15:52:48.732688 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:52:49 crc kubenswrapper[4937]: I0225 15:52:49.277566 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrgxt" event={"ID":"3c753535-03f4-4888-8e28-43b4924726ae","Type":"ContainerStarted","Data":"447b39b9ecee8c2bc457edcffb99f49b31b65646338fb309426fd252f5c8d027"} Feb 25 15:52:49 crc kubenswrapper[4937]: I0225 15:52:49.278969 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Feb 25 15:52:49 crc kubenswrapper[4937]: I0225 15:52:49.279328 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Feb 25 15:52:49 crc kubenswrapper[4937]: I0225 15:52:49.279370 4937 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="5dffc17bc6cc37e12016244f182206403c216c983397f2eecd4bf31a4b5c40d6" exitCode=255 Feb 25 15:52:49 crc kubenswrapper[4937]: I0225 15:52:49.279429 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"5dffc17bc6cc37e12016244f182206403c216c983397f2eecd4bf31a4b5c40d6"} Feb 25 15:52:49 crc kubenswrapper[4937]: I0225 15:52:49.279458 4937 scope.go:117] "RemoveContainer" containerID="76ff5da83007eb57f17a354fea6a2b0441a4b7a037f0156b3f11e71ae5503482" Feb 25 15:52:49 crc kubenswrapper[4937]: I0225 15:52:49.279764 4937 scope.go:117] "RemoveContainer" containerID="5dffc17bc6cc37e12016244f182206403c216c983397f2eecd4bf31a4b5c40d6" Feb 25 15:52:49 crc kubenswrapper[4937]: E0225 15:52:49.279928 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:52:49 crc kubenswrapper[4937]: I0225 15:52:49.282548 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2889cdf4479f7bff4cefcef8ad38aa864488cfeb923286a27ac157b77fa812a2"} Feb 25 15:52:49 crc kubenswrapper[4937]: I0225 15:52:49.282588 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1a1b37661e1c57bfab0827d582116cecc4a00261f0954c9b6fa311d32eb349ca"} Feb 25 15:52:49 crc kubenswrapper[4937]: I0225 15:52:49.330264 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-scndr" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" containerName="registry-server" probeResult="failure" output=< Feb 25 15:52:49 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 15:52:49 crc kubenswrapper[4937]: > Feb 25 15:52:49 crc kubenswrapper[4937]: I0225 15:52:49.778818 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l9tlm" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" containerName="registry-server" probeResult="failure" output=< Feb 25 15:52:49 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 15:52:49 crc kubenswrapper[4937]: > Feb 25 15:52:50 crc kubenswrapper[4937]: I0225 15:52:50.288478 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Feb 25 15:52:50 crc kubenswrapper[4937]: I0225 15:52:50.288748 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:50 crc kubenswrapper[4937]: I0225 15:52:50.288770 4937 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:52:50 crc kubenswrapper[4937]: I0225 15:52:50.288787 4937 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:52:50 crc kubenswrapper[4937]: I0225 15:52:50.386890 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:50 crc kubenswrapper[4937]: I0225 15:52:50.387278 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:50 crc kubenswrapper[4937]: I0225 15:52:50.397617 4937 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]log ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]etcd ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 25 15:52:50 crc kubenswrapper[4937]: 
[+]poststarthook/openshift.io-api-request-count-filter ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/generic-apiserver-start-informers ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/priority-and-fairness-filter ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/start-apiextensions-informers ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/start-apiextensions-controllers ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/crd-informer-synced ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/start-system-namespaces-controller ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/start-service-ip-repair-controllers ok Feb 25 15:52:50 crc kubenswrapper[4937]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/bootstrap-controller ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/start-kube-aggregator-informers ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/apiservice-registration-controller ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/apiservice-discovery-controller ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]autoregister-completion ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/apiservice-openapi-controller ok Feb 25 15:52:50 crc kubenswrapper[4937]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 25 15:52:50 crc kubenswrapper[4937]: livez check failed Feb 25 15:52:50 crc kubenswrapper[4937]: I0225 15:52:50.397685 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 15:52:55 crc kubenswrapper[4937]: 
I0225 15:52:55.301205 4937 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:55 crc kubenswrapper[4937]: I0225 15:52:55.396558 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:55 crc kubenswrapper[4937]: I0225 15:52:55.401837 4937 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b585585b-975c-4fd7-8599-83ddad968acd" Feb 25 15:52:55 crc kubenswrapper[4937]: I0225 15:52:55.402813 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:52:55 crc kubenswrapper[4937]: I0225 15:52:55.402850 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:52:55 crc kubenswrapper[4937]: I0225 15:52:55.459387 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:52:55 crc kubenswrapper[4937]: I0225 15:52:55.476255 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:52:55 crc kubenswrapper[4937]: I0225 15:52:55.476322 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:52:55 crc kubenswrapper[4937]: I0225 15:52:55.517638 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:52:56 crc kubenswrapper[4937]: I0225 15:52:56.326688 4937 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:52:56 crc kubenswrapper[4937]: I0225 15:52:56.326725 4937 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:52:56 crc kubenswrapper[4937]: I0225 15:52:56.376726 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:52:56 crc kubenswrapper[4937]: I0225 15:52:56.382244 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:52:57 crc kubenswrapper[4937]: I0225 15:52:57.338922 4937 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:52:57 crc kubenswrapper[4937]: I0225 15:52:57.339605 4937 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:52:57 crc kubenswrapper[4937]: I0225 15:52:57.344564 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:52:57 crc kubenswrapper[4937]: I0225 15:52:57.354594 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:52:57 crc kubenswrapper[4937]: I0225 15:52:57.366652 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:52:57 crc kubenswrapper[4937]: I0225 15:52:57.367044 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 15:52:57 crc kubenswrapper[4937]: I0225 15:52:57.414883 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:52:57 crc kubenswrapper[4937]: I0225 15:52:57.715640 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:52:57 crc kubenswrapper[4937]: I0225 15:52:57.716077 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:52:57 crc kubenswrapper[4937]: I0225 15:52:57.798734 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:52:58 crc kubenswrapper[4937]: I0225 15:52:58.333857 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:52:58 crc kubenswrapper[4937]: I0225 15:52:58.348647 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"156d962ab6307226e313828a6d37f595017ce32e15ad9f99d9fc1ff858639ac5"} Feb 25 15:52:58 crc kubenswrapper[4937]: I0225 15:52:58.348719 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9275d759e494821c8148c705d385b983bf80316021e17e2fde289177187a1ac2"} Feb 25 15:52:58 crc kubenswrapper[4937]: I0225 15:52:58.349268 4937 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:52:58 crc kubenswrapper[4937]: I0225 15:52:58.349295 4937 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:52:58 crc kubenswrapper[4937]: I0225 15:52:58.378561 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:52:58 crc kubenswrapper[4937]: I0225 15:52:58.396514 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:52:58 crc kubenswrapper[4937]: I0225 15:52:58.774566 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:52:58 crc kubenswrapper[4937]: I0225 15:52:58.811480 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:53:00 crc kubenswrapper[4937]: I0225 15:53:00.367420 4937 scope.go:117] "RemoveContainer" containerID="5dffc17bc6cc37e12016244f182206403c216c983397f2eecd4bf31a4b5c40d6" Feb 25 15:53:00 crc kubenswrapper[4937]: E0225 15:53:00.367743 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints 
pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:53:01 crc kubenswrapper[4937]: I0225 15:53:01.453243 4937 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b585585b-975c-4fd7-8599-83ddad968acd" Feb 25 15:53:14 crc kubenswrapper[4937]: I0225 15:53:14.367804 4937 scope.go:117] "RemoveContainer" containerID="5dffc17bc6cc37e12016244f182206403c216c983397f2eecd4bf31a4b5c40d6" Feb 25 15:53:15 crc kubenswrapper[4937]: I0225 15:53:15.466291 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Feb 25 15:53:15 crc kubenswrapper[4937]: I0225 15:53:15.466624 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"517adeb7a2f620b61347781008c4f2ae60bbeda01a7b303dbcb6b967319db187"} Feb 25 15:53:17 crc kubenswrapper[4937]: I0225 15:53:17.479390 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/3.log" Feb 25 15:53:17 crc kubenswrapper[4937]: I0225 15:53:17.480733 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Feb 25 15:53:17 crc kubenswrapper[4937]: I0225 15:53:17.480796 4937 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="517adeb7a2f620b61347781008c4f2ae60bbeda01a7b303dbcb6b967319db187" exitCode=255 Feb 25 15:53:17 crc kubenswrapper[4937]: I0225 15:53:17.480851 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"517adeb7a2f620b61347781008c4f2ae60bbeda01a7b303dbcb6b967319db187"} Feb 25 15:53:17 crc kubenswrapper[4937]: I0225 15:53:17.480945 4937 scope.go:117] "RemoveContainer" containerID="5dffc17bc6cc37e12016244f182206403c216c983397f2eecd4bf31a4b5c40d6" Feb 25 15:53:17 crc kubenswrapper[4937]: I0225 15:53:17.481610 4937 scope.go:117] "RemoveContainer" containerID="517adeb7a2f620b61347781008c4f2ae60bbeda01a7b303dbcb6b967319db187" Feb 25 15:53:17 crc kubenswrapper[4937]: E0225 15:53:17.481932 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:53:18 crc kubenswrapper[4937]: I0225 15:53:18.494966 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/3.log" Feb 25 15:53:24 crc 
kubenswrapper[4937]: I0225 15:53:24.542027 4937 generic.go:334] "Generic (PLEG): container finished" podID="906509ff-be49-4c28-95b5-9f80cb885ece" containerID="d55d9613d6b79dc417482e2442fd54315d5e7e3d9d531b239fde28469b537c11" exitCode=0 Feb 25 15:53:24 crc kubenswrapper[4937]: I0225 15:53:24.542655 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" event={"ID":"906509ff-be49-4c28-95b5-9f80cb885ece","Type":"ContainerDied","Data":"d55d9613d6b79dc417482e2442fd54315d5e7e3d9d531b239fde28469b537c11"} Feb 25 15:53:24 crc kubenswrapper[4937]: I0225 15:53:24.543479 4937 scope.go:117] "RemoveContainer" containerID="d55d9613d6b79dc417482e2442fd54315d5e7e3d9d531b239fde28469b537c11" Feb 25 15:53:25 crc kubenswrapper[4937]: I0225 15:53:25.024481 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 25 15:53:25 crc kubenswrapper[4937]: I0225 15:53:25.553905 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-r5bpn_906509ff-be49-4c28-95b5-9f80cb885ece/marketplace-operator/1.log" Feb 25 15:53:25 crc kubenswrapper[4937]: I0225 15:53:25.556324 4937 generic.go:334] "Generic (PLEG): container finished" podID="906509ff-be49-4c28-95b5-9f80cb885ece" containerID="3f73bc081f49328b2f1d9a6859c7d975cb70297b8d3bc8d3258a0ee1ee61b32b" exitCode=1 Feb 25 15:53:25 crc kubenswrapper[4937]: I0225 15:53:25.556386 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" event={"ID":"906509ff-be49-4c28-95b5-9f80cb885ece","Type":"ContainerDied","Data":"3f73bc081f49328b2f1d9a6859c7d975cb70297b8d3bc8d3258a0ee1ee61b32b"} Feb 25 15:53:25 crc kubenswrapper[4937]: I0225 15:53:25.556434 4937 scope.go:117] "RemoveContainer" containerID="d55d9613d6b79dc417482e2442fd54315d5e7e3d9d531b239fde28469b537c11" Feb 25 15:53:25 crc kubenswrapper[4937]: I0225 15:53:25.557112 4937 scope.go:117] "RemoveContainer" containerID="3f73bc081f49328b2f1d9a6859c7d975cb70297b8d3bc8d3258a0ee1ee61b32b" Feb 25 15:53:25 crc kubenswrapper[4937]: E0225 15:53:25.557676 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-r5bpn_openshift-marketplace(906509ff-be49-4c28-95b5-9f80cb885ece)\"" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" Feb 25 15:53:26 crc kubenswrapper[4937]: I0225 15:53:26.566127 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-r5bpn_906509ff-be49-4c28-95b5-9f80cb885ece/marketplace-operator/1.log" Feb 25 15:53:27 crc kubenswrapper[4937]: I0225 15:53:27.311524 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:53:27 crc kubenswrapper[4937]: I0225 15:53:27.311613 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:53:27 crc kubenswrapper[4937]: I0225 15:53:27.312223 4937 scope.go:117] "RemoveContainer" containerID="3f73bc081f49328b2f1d9a6859c7d975cb70297b8d3bc8d3258a0ee1ee61b32b" Feb 25 15:53:27 crc kubenswrapper[4937]: E0225 15:53:27.312632 4937 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-r5bpn_openshift-marketplace(906509ff-be49-4c28-95b5-9f80cb885ece)\"" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" Feb 25 15:53:27 crc kubenswrapper[4937]: I0225 15:53:27.475471 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 15:53:27 crc kubenswrapper[4937]: I0225 15:53:27.574077 4937 scope.go:117] "RemoveContainer" containerID="3f73bc081f49328b2f1d9a6859c7d975cb70297b8d3bc8d3258a0ee1ee61b32b" Feb 25 15:53:27 crc kubenswrapper[4937]: E0225 15:53:27.574520 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-r5bpn_openshift-marketplace(906509ff-be49-4c28-95b5-9f80cb885ece)\"" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" Feb 25 15:53:29 crc kubenswrapper[4937]: I0225 15:53:29.368271 4937 scope.go:117] "RemoveContainer" containerID="517adeb7a2f620b61347781008c4f2ae60bbeda01a7b303dbcb6b967319db187" Feb 25 15:53:29 crc kubenswrapper[4937]: E0225 15:53:29.368671 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:53:31 crc kubenswrapper[4937]: I0225 15:53:31.405773 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 25 15:53:33 crc kubenswrapper[4937]: I0225 15:53:33.615193 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 25 15:53:34 crc kubenswrapper[4937]: I0225 15:53:34.984733 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 15:53:37 crc kubenswrapper[4937]: I0225 15:53:37.175280 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 25 15:53:39 crc kubenswrapper[4937]: I0225 15:53:39.148202 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 25 15:53:40 crc kubenswrapper[4937]: I0225 15:53:40.368701 4937 scope.go:117] "RemoveContainer" containerID="517adeb7a2f620b61347781008c4f2ae60bbeda01a7b303dbcb6b967319db187" Feb 25 15:53:40 crc kubenswrapper[4937]: E0225 15:53:40.369134 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:53:40 crc kubenswrapper[4937]: I0225 15:53:40.369168 
4937 scope.go:117] "RemoveContainer" containerID="3f73bc081f49328b2f1d9a6859c7d975cb70297b8d3bc8d3258a0ee1ee61b32b" Feb 25 15:53:40 crc kubenswrapper[4937]: I0225 15:53:40.720410 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 25 15:53:41 crc kubenswrapper[4937]: I0225 15:53:41.495415 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 15:53:41 crc kubenswrapper[4937]: I0225 15:53:41.496269 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 15:53:41 crc kubenswrapper[4937]: I0225 15:53:41.924161 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 25 15:53:42 crc kubenswrapper[4937]: I0225 15:53:42.680182 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-r5bpn_906509ff-be49-4c28-95b5-9f80cb885ece/marketplace-operator/1.log" Feb 25 15:53:42 crc kubenswrapper[4937]: I0225 15:53:42.680286 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" event={"ID":"906509ff-be49-4c28-95b5-9f80cb885ece","Type":"ContainerStarted","Data":"74b7b25a4e53b9ba1dc21b5169e1b8a1dd55cbb38c6b31e7b8e2d6cb94af884a"} Feb 25 15:53:42 crc kubenswrapper[4937]: I0225 15:53:42.680742 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:53:42 crc kubenswrapper[4937]: I0225 15:53:42.682445 4937 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r5bpn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Feb 25 15:53:42 crc kubenswrapper[4937]: I0225 15:53:42.682707 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Feb 25 15:53:42 crc kubenswrapper[4937]: I0225 15:53:42.849110 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 15:53:43 crc kubenswrapper[4937]: I0225 15:53:43.005441 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 25 15:53:43 crc kubenswrapper[4937]: I0225 15:53:43.036677 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 25 15:53:43 crc kubenswrapper[4937]: I0225 15:53:43.595504 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 25 15:53:43 
crc kubenswrapper[4937]: I0225 15:53:43.693949 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-r5bpn_906509ff-be49-4c28-95b5-9f80cb885ece/marketplace-operator/2.log" Feb 25 15:53:43 crc kubenswrapper[4937]: I0225 15:53:43.694954 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-r5bpn_906509ff-be49-4c28-95b5-9f80cb885ece/marketplace-operator/1.log" Feb 25 15:53:43 crc kubenswrapper[4937]: I0225 15:53:43.694984 4937 generic.go:334] "Generic (PLEG): container finished" podID="906509ff-be49-4c28-95b5-9f80cb885ece" containerID="74b7b25a4e53b9ba1dc21b5169e1b8a1dd55cbb38c6b31e7b8e2d6cb94af884a" exitCode=1 Feb 25 15:53:43 crc kubenswrapper[4937]: I0225 15:53:43.695009 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" event={"ID":"906509ff-be49-4c28-95b5-9f80cb885ece","Type":"ContainerDied","Data":"74b7b25a4e53b9ba1dc21b5169e1b8a1dd55cbb38c6b31e7b8e2d6cb94af884a"} Feb 25 15:53:43 crc kubenswrapper[4937]: I0225 15:53:43.695039 4937 scope.go:117] "RemoveContainer" containerID="3f73bc081f49328b2f1d9a6859c7d975cb70297b8d3bc8d3258a0ee1ee61b32b" Feb 25 15:53:43 crc kubenswrapper[4937]: I0225 15:53:43.695452 4937 scope.go:117] "RemoveContainer" containerID="74b7b25a4e53b9ba1dc21b5169e1b8a1dd55cbb38c6b31e7b8e2d6cb94af884a" Feb 25 15:53:43 crc kubenswrapper[4937]: E0225 15:53:43.695625 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-r5bpn_openshift-marketplace(906509ff-be49-4c28-95b5-9f80cb885ece)\"" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" Feb 25 15:53:44 crc kubenswrapper[4937]: I0225 15:53:44.279135 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 25 15:53:44 crc kubenswrapper[4937]: I0225 15:53:44.421338 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 25 15:53:44 crc kubenswrapper[4937]: I0225 15:53:44.705864 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-r5bpn_906509ff-be49-4c28-95b5-9f80cb885ece/marketplace-operator/2.log" Feb 25 15:53:44 crc kubenswrapper[4937]: I0225 15:53:44.707703 4937 scope.go:117] "RemoveContainer" containerID="74b7b25a4e53b9ba1dc21b5169e1b8a1dd55cbb38c6b31e7b8e2d6cb94af884a" Feb 25 15:53:44 crc kubenswrapper[4937]: E0225 15:53:44.708010 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-r5bpn_openshift-marketplace(906509ff-be49-4c28-95b5-9f80cb885ece)\"" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" Feb 25 15:53:44 crc kubenswrapper[4937]: I0225 15:53:44.788720 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 25 15:53:45 crc kubenswrapper[4937]: I0225 15:53:45.378796 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 25 15:53:45 crc 
kubenswrapper[4937]: I0225 15:53:45.569203 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 25 15:53:45 crc kubenswrapper[4937]: I0225 15:53:45.682668 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 25 15:53:45 crc kubenswrapper[4937]: I0225 15:53:45.836294 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 25 15:53:46 crc kubenswrapper[4937]: I0225 15:53:46.052638 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 25 15:53:46 crc kubenswrapper[4937]: I0225 15:53:46.120939 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 25 15:53:46 crc kubenswrapper[4937]: I0225 15:53:46.295411 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 25 15:53:46 crc kubenswrapper[4937]: I0225 15:53:46.397545 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 25 15:53:46 crc kubenswrapper[4937]: I0225 15:53:46.915471 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 25 15:53:47 crc kubenswrapper[4937]: I0225 15:53:47.018271 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 25 15:53:47 crc kubenswrapper[4937]: I0225 15:53:47.145303 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 25 15:53:47 crc kubenswrapper[4937]: I0225 15:53:47.222194 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 25 15:53:47 crc kubenswrapper[4937]: I0225 15:53:47.279169 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 25 15:53:47 crc kubenswrapper[4937]: I0225 15:53:47.311097 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:53:47 crc kubenswrapper[4937]: I0225 15:53:47.311682 4937 scope.go:117] "RemoveContainer" containerID="74b7b25a4e53b9ba1dc21b5169e1b8a1dd55cbb38c6b31e7b8e2d6cb94af884a" Feb 25 15:53:47 crc kubenswrapper[4937]: E0225 15:53:47.311904 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-r5bpn_openshift-marketplace(906509ff-be49-4c28-95b5-9f80cb885ece)\"" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" Feb 25 15:53:47 crc kubenswrapper[4937]: I0225 15:53:47.552776 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 25 15:53:47 crc kubenswrapper[4937]: I0225 15:53:47.555672 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 25 15:53:47 crc kubenswrapper[4937]: I0225 15:53:47.664390 4937 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"openshift-service-ca.crt" Feb 25 15:53:48 crc kubenswrapper[4937]: I0225 15:53:48.222310 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 25 15:53:48 crc kubenswrapper[4937]: I0225 15:53:48.424621 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 25 15:53:48 crc kubenswrapper[4937]: I0225 15:53:48.518151 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 25 15:53:48 crc kubenswrapper[4937]: I0225 15:53:48.593537 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 25 15:53:48 crc kubenswrapper[4937]: I0225 15:53:48.839241 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 25 15:53:48 crc kubenswrapper[4937]: I0225 15:53:48.933756 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 25 15:53:49 crc kubenswrapper[4937]: I0225 15:53:49.188452 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 25 15:53:49 crc kubenswrapper[4937]: I0225 15:53:49.739037 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-56656f9798-sbntq_b771f4d8-8253-4530-9e1a-e0ca06f263e4/machine-approver-controller/0.log" Feb 25 15:53:49 crc kubenswrapper[4937]: I0225 15:53:49.739725 4937 generic.go:334] "Generic (PLEG): container finished" podID="b771f4d8-8253-4530-9e1a-e0ca06f263e4" containerID="096fd1ba256aac0190d0c8386c67d9fcdbf3dd3fd696df3f74a8483ea439fa94" exitCode=255 Feb 25 15:53:49 crc kubenswrapper[4937]: I0225 15:53:49.739785 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" event={"ID":"b771f4d8-8253-4530-9e1a-e0ca06f263e4","Type":"ContainerDied","Data":"096fd1ba256aac0190d0c8386c67d9fcdbf3dd3fd696df3f74a8483ea439fa94"} Feb 25 15:53:49 crc kubenswrapper[4937]: I0225 15:53:49.740521 4937 scope.go:117] "RemoveContainer" containerID="096fd1ba256aac0190d0c8386c67d9fcdbf3dd3fd696df3f74a8483ea439fa94" Feb 25 15:53:50 crc kubenswrapper[4937]: I0225 15:53:50.688073 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 25 15:53:50 crc kubenswrapper[4937]: I0225 15:53:50.747104 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dz785_9b1dc13b-9b02-42b0-a00e-21f15f9f98a2/control-plane-machine-set-operator/0.log" Feb 25 15:53:50 crc kubenswrapper[4937]: I0225 15:53:50.747154 4937 generic.go:334] "Generic (PLEG): container finished" podID="9b1dc13b-9b02-42b0-a00e-21f15f9f98a2" containerID="15083cf85fd2d5656ed8d2f192c65a6a2556f18991b2db9af4b8124c531c08a8" exitCode=1 Feb 25 15:53:50 crc kubenswrapper[4937]: I0225 15:53:50.747250 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785" event={"ID":"9b1dc13b-9b02-42b0-a00e-21f15f9f98a2","Type":"ContainerDied","Data":"15083cf85fd2d5656ed8d2f192c65a6a2556f18991b2db9af4b8124c531c08a8"} Feb 25 15:53:50 crc kubenswrapper[4937]: 
I0225 15:53:50.748152 4937 scope.go:117] "RemoveContainer" containerID="15083cf85fd2d5656ed8d2f192c65a6a2556f18991b2db9af4b8124c531c08a8" Feb 25 15:53:50 crc kubenswrapper[4937]: I0225 15:53:50.749514 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-56656f9798-sbntq_b771f4d8-8253-4530-9e1a-e0ca06f263e4/machine-approver-controller/0.log" Feb 25 15:53:50 crc kubenswrapper[4937]: I0225 15:53:50.749998 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sbntq" event={"ID":"b771f4d8-8253-4530-9e1a-e0ca06f263e4","Type":"ContainerStarted","Data":"827a92ccca76d195604b3ceffa9b4b988ab2745a260c04337502d615d76f3c97"} Feb 25 15:53:50 crc kubenswrapper[4937]: I0225 15:53:50.809650 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 25 15:53:51 crc kubenswrapper[4937]: I0225 15:53:51.088987 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 25 15:53:51 crc kubenswrapper[4937]: I0225 15:53:51.394079 4937 scope.go:117] "RemoveContainer" containerID="517adeb7a2f620b61347781008c4f2ae60bbeda01a7b303dbcb6b967319db187" Feb 25 15:53:51 crc kubenswrapper[4937]: E0225 15:53:51.394554 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 15:53:51 crc kubenswrapper[4937]: I0225 15:53:51.446370 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 25 15:53:51 crc kubenswrapper[4937]: I0225 15:53:51.557622 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 25 15:53:51 crc kubenswrapper[4937]: I0225 15:53:51.759635 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dz785_9b1dc13b-9b02-42b0-a00e-21f15f9f98a2/control-plane-machine-set-operator/0.log" Feb 25 15:53:51 crc kubenswrapper[4937]: I0225 15:53:51.759733 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dz785" event={"ID":"9b1dc13b-9b02-42b0-a00e-21f15f9f98a2","Type":"ContainerStarted","Data":"931153990b82bdf3a4f8d37603198e30a900b0c4a1cdc67df91c02c149bb2914"} Feb 25 15:53:51 crc kubenswrapper[4937]: I0225 15:53:51.867742 4937 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 25 15:53:51 crc kubenswrapper[4937]: I0225 15:53:51.956443 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 25 15:53:52 crc kubenswrapper[4937]: I0225 15:53:52.029864 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 25 15:53:52 crc kubenswrapper[4937]: I0225 15:53:52.471240 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 25 15:53:52 crc 
kubenswrapper[4937]: I0225 15:53:52.472963 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 15:53:52 crc kubenswrapper[4937]: I0225 15:53:52.484956 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 25 15:53:52 crc kubenswrapper[4937]: I0225 15:53:52.604609 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 25 15:53:52 crc kubenswrapper[4937]: I0225 15:53:52.809347 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 25 15:53:52 crc kubenswrapper[4937]: I0225 15:53:52.908835 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 25 15:53:53 crc kubenswrapper[4937]: I0225 15:53:53.354360 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 25 15:53:53 crc kubenswrapper[4937]: I0225 15:53:53.398558 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 25 15:53:53 crc kubenswrapper[4937]: I0225 15:53:53.653573 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 25 15:53:53 crc kubenswrapper[4937]: I0225 15:53:53.813029 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 25 15:53:53 crc kubenswrapper[4937]: I0225 15:53:53.934035 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 25 15:53:54 crc kubenswrapper[4937]: I0225 15:53:54.010577 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 25 15:53:54 crc kubenswrapper[4937]: I0225 15:53:54.104740 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 25 15:53:54 crc kubenswrapper[4937]: I0225 15:53:54.302157 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 25 15:53:54 crc kubenswrapper[4937]: I0225 15:53:54.514313 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 25 15:53:54 crc kubenswrapper[4937]: I0225 15:53:54.626647 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 25 15:53:54 crc kubenswrapper[4937]: I0225 15:53:54.636446 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 25 15:53:54 crc kubenswrapper[4937]: I0225 15:53:54.800024 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 25 15:53:55 crc kubenswrapper[4937]: I0225 15:53:55.057325 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 25 15:53:55 crc kubenswrapper[4937]: I0225 15:53:55.518166 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 25 15:53:55 crc kubenswrapper[4937]: I0225 15:53:55.719442 4937 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 25 15:53:55 crc kubenswrapper[4937]: I0225 15:53:55.780408 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 25 15:53:55 crc kubenswrapper[4937]: I0225 15:53:55.936649 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 25 15:53:56 crc kubenswrapper[4937]: I0225 15:53:56.987610 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.013289 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.367828 4937 scope.go:117] "RemoveContainer" containerID="74b7b25a4e53b9ba1dc21b5169e1b8a1dd55cbb38c6b31e7b8e2d6cb94af884a" Feb 25 15:53:57 crc kubenswrapper[4937]: E0225 15:53:57.368126 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-r5bpn_openshift-marketplace(906509ff-be49-4c28-95b5-9f80cb885ece)\"" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.688804 4937 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.689449 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-54sqd" podStartSLOduration=76.135914841 podStartE2EDuration="4m3.689419241s" podCreationTimestamp="2026-02-25 15:49:54 +0000 UTC" firstStartedPulling="2026-02-25 15:49:56.672890857 +0000 UTC m=+247.686282747" lastFinishedPulling="2026-02-25 15:52:44.226395227 +0000 UTC m=+415.239787147" observedRunningTime="2026-02-25 15:52:54.176718717 +0000 UTC m=+425.190110597" watchObservedRunningTime="2026-02-25 15:53:57.689419241 +0000 UTC m=+488.702811171" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.690901 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-scndr" podStartSLOduration=91.520123342 podStartE2EDuration="4m0.690889588s" podCreationTimestamp="2026-02-25 15:49:57 +0000 UTC" firstStartedPulling="2026-02-25 15:50:06.955069767 +0000 UTC m=+257.968461657" lastFinishedPulling="2026-02-25 15:52:36.125836023 +0000 UTC m=+407.139227903" observedRunningTime="2026-02-25 15:52:54.233413584 +0000 UTC m=+425.246805474" watchObservedRunningTime="2026-02-25 15:53:57.690889588 +0000 UTC m=+488.704281518" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.694839 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l8xkp" podStartSLOduration=76.15270246 podStartE2EDuration="4m3.694826237s" podCreationTimestamp="2026-02-25 15:49:54 +0000 UTC" firstStartedPulling="2026-02-25 15:49:56.688072555 +0000 UTC m=+247.701464445" lastFinishedPulling="2026-02-25 15:52:44.230196302 +0000 UTC m=+415.243588222" observedRunningTime="2026-02-25 15:52:54.328342333 +0000 UTC m=+425.341734233" watchObservedRunningTime="2026-02-25 
15:53:57.694826237 +0000 UTC m=+488.708218167" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.695629 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mrgxt" podStartSLOduration=76.296191199 podStartE2EDuration="4m0.695617217s" podCreationTimestamp="2026-02-25 15:49:57 +0000 UTC" firstStartedPulling="2026-02-25 15:50:03.76911979 +0000 UTC m=+254.782511680" lastFinishedPulling="2026-02-25 15:52:48.168545818 +0000 UTC m=+419.181937698" observedRunningTime="2026-02-25 15:52:54.410017769 +0000 UTC m=+425.423409669" watchObservedRunningTime="2026-02-25 15:53:57.695617217 +0000 UTC m=+488.709009147" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.695789 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l9tlm" podStartSLOduration=82.417840649 podStartE2EDuration="3m59.695783451s" podCreationTimestamp="2026-02-25 15:49:58 +0000 UTC" firstStartedPulling="2026-02-25 15:50:06.94984082 +0000 UTC m=+257.963232710" lastFinishedPulling="2026-02-25 15:52:44.227783582 +0000 UTC m=+415.241175512" observedRunningTime="2026-02-25 15:52:54.108834159 +0000 UTC m=+425.122226059" watchObservedRunningTime="2026-02-25 15:53:57.695783451 +0000 UTC m=+488.709175391" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.697762 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gsxxs" podStartSLOduration=91.013572994 podStartE2EDuration="4m1.697746761s" podCreationTimestamp="2026-02-25 15:49:56 +0000 UTC" firstStartedPulling="2026-02-25 15:50:03.769238943 +0000 UTC m=+254.782630843" lastFinishedPulling="2026-02-25 15:52:34.45341269 +0000 UTC m=+405.466804610" observedRunningTime="2026-02-25 15:52:54.366107594 +0000 UTC m=+425.379499514" watchObservedRunningTime="2026-02-25 15:53:57.697746761 +0000 UTC m=+488.711138691" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.698414 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.698478 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-infra/auto-csr-approver-29533912-7rslj"] Feb 25 15:53:57 crc kubenswrapper[4937]: E0225 15:53:57.698943 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" containerName="installer" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.698987 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" containerName="installer" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.699028 4937 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.699089 4937 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8ddf09d5-0ab8-4bb9-a321-b1a29590b29f" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.699221 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8624c64-7d59-4173-8928-e7dff50f1039" containerName="installer" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.700126 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533912-7rslj" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.704976 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.705409 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.705781 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.709559 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.733337 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.735791 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=62.735764158 podStartE2EDuration="1m2.735764158s" podCreationTimestamp="2026-02-25 15:52:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:53:57.729639224 +0000 UTC m=+488.743031134" watchObservedRunningTime="2026-02-25 15:53:57.735764158 +0000 UTC m=+488.749156088" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.750364 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czg5v\" (UniqueName: \"kubernetes.io/projected/6cf00929-ff8b-42c5-96f8-6f02e52372ae-kube-api-access-czg5v\") pod \"auto-csr-approver-29533912-7rslj\" (UID: \"6cf00929-ff8b-42c5-96f8-6f02e52372ae\") " pod="openshift-infra/auto-csr-approver-29533912-7rslj" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.752709 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.776744 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=26.776713398 podStartE2EDuration="26.776713398s" podCreationTimestamp="2026-02-25 15:53:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:53:57.765581538 +0000 UTC m=+488.778973448" watchObservedRunningTime="2026-02-25 15:53:57.776713398 +0000 UTC m=+488.790105308" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.781810 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.851311 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czg5v\" (UniqueName: \"kubernetes.io/projected/6cf00929-ff8b-42c5-96f8-6f02e52372ae-kube-api-access-czg5v\") pod \"auto-csr-approver-29533912-7rslj\" (UID: \"6cf00929-ff8b-42c5-96f8-6f02e52372ae\") " pod="openshift-infra/auto-csr-approver-29533912-7rslj" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.869939 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czg5v\" (UniqueName: 
\"kubernetes.io/projected/6cf00929-ff8b-42c5-96f8-6f02e52372ae-kube-api-access-czg5v\") pod \"auto-csr-approver-29533912-7rslj\" (UID: \"6cf00929-ff8b-42c5-96f8-6f02e52372ae\") " pod="openshift-infra/auto-csr-approver-29533912-7rslj" Feb 25 15:53:57 crc kubenswrapper[4937]: I0225 15:53:57.888043 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 25 15:53:58 crc kubenswrapper[4937]: I0225 15:53:58.027394 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533912-7rslj" Feb 25 15:53:58 crc kubenswrapper[4937]: I0225 15:53:58.079973 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 25 15:53:58 crc kubenswrapper[4937]: I0225 15:53:58.648079 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 25 15:53:58 crc kubenswrapper[4937]: I0225 15:53:58.892824 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.136812 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.223316 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.270803 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.277852 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.484580 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.528988 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.574816 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.576285 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.639617 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.642203 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.680311 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.829184 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.897845 
4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 25 15:53:59 crc kubenswrapper[4937]: I0225 15:53:59.966205 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.084927 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.138220 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533914-nfxdw"] Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.138973 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533914-nfxdw" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.184524 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qzxf\" (UniqueName: \"kubernetes.io/projected/e90b54e9-9441-4afc-a28d-6febea150a04-kube-api-access-2qzxf\") pod \"auto-csr-approver-29533914-nfxdw\" (UID: \"e90b54e9-9441-4afc-a28d-6febea150a04\") " pod="openshift-infra/auto-csr-approver-29533914-nfxdw" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.285952 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qzxf\" (UniqueName: \"kubernetes.io/projected/e90b54e9-9441-4afc-a28d-6febea150a04-kube-api-access-2qzxf\") pod \"auto-csr-approver-29533914-nfxdw\" (UID: \"e90b54e9-9441-4afc-a28d-6febea150a04\") " pod="openshift-infra/auto-csr-approver-29533914-nfxdw" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.315907 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qzxf\" (UniqueName: \"kubernetes.io/projected/e90b54e9-9441-4afc-a28d-6febea150a04-kube-api-access-2qzxf\") pod \"auto-csr-approver-29533914-nfxdw\" (UID: \"e90b54e9-9441-4afc-a28d-6febea150a04\") " pod="openshift-infra/auto-csr-approver-29533914-nfxdw" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.459596 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.463940 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533914-nfxdw" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.507149 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.513557 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.527339 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.551819 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.554270 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.635748 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.640712 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.699801 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 15:54:00 crc kubenswrapper[4937]: I0225 15:54:00.937604 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 25 15:54:01 crc kubenswrapper[4937]: I0225 15:54:01.164295 4937 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 25 15:54:01 crc kubenswrapper[4937]: I0225 15:54:01.494353 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 25 15:54:01 crc kubenswrapper[4937]: I0225 15:54:01.568411 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 25 15:54:01 crc kubenswrapper[4937]: I0225 15:54:01.745308 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 25 15:54:01 crc kubenswrapper[4937]: I0225 15:54:01.775292 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 25 15:54:01 crc kubenswrapper[4937]: I0225 15:54:01.811469 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 25 15:54:01 crc kubenswrapper[4937]: I0225 15:54:01.888645 4937 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 25 15:54:01 crc kubenswrapper[4937]: I0225 15:54:01.935067 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.051771 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.073839 4937 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.157065 4937 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.157301 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2f59940a441b8bf7c213dadb16cfdacb7fdd87ff4b235b1a6339a8b538eed79e" gracePeriod=5 Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.259417 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.365996 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.403813 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.408507 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.469818 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.752195 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.849104 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.851562 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.955954 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.987905 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 25 15:54:02 crc kubenswrapper[4937]: I0225 15:54:02.990402 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 25 15:54:03 crc kubenswrapper[4937]: I0225 15:54:03.007537 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 25 15:54:03 crc kubenswrapper[4937]: I0225 15:54:03.247526 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 25 15:54:03 crc kubenswrapper[4937]: I0225 15:54:03.285323 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 25 15:54:03 crc kubenswrapper[4937]: I0225 15:54:03.515635 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 25 15:54:03 crc 
kubenswrapper[4937]: I0225 15:54:03.750986 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 25 15:54:03 crc kubenswrapper[4937]: I0225 15:54:03.759189 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 25 15:54:04 crc kubenswrapper[4937]: I0225 15:54:04.186420 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 15:54:04 crc kubenswrapper[4937]: I0225 15:54:04.367740 4937 scope.go:117] "RemoveContainer" containerID="517adeb7a2f620b61347781008c4f2ae60bbeda01a7b303dbcb6b967319db187" Feb 25 15:54:04 crc kubenswrapper[4937]: I0225 15:54:04.482184 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 25 15:54:04 crc kubenswrapper[4937]: I0225 15:54:04.627072 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 25 15:54:04 crc kubenswrapper[4937]: I0225 15:54:04.762060 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 25 15:54:04 crc kubenswrapper[4937]: I0225 15:54:04.806160 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 25 15:54:04 crc kubenswrapper[4937]: I0225 15:54:04.836794 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 25 15:54:04 crc kubenswrapper[4937]: I0225 15:54:04.869079 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/3.log" Feb 25 15:54:04 crc kubenswrapper[4937]: I0225 15:54:04.869170 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"815c00eeef1ee04bfacf6cc9f5e24038b4f6103f8e80c579c33302ac32b5e571"} Feb 25 15:54:04 crc kubenswrapper[4937]: I0225 15:54:04.960754 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 25 15:54:05 crc kubenswrapper[4937]: I0225 15:54:05.129314 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 15:54:05 crc kubenswrapper[4937]: I0225 15:54:05.261386 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 25 15:54:05 crc kubenswrapper[4937]: I0225 15:54:05.268959 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 25 15:54:05 crc kubenswrapper[4937]: I0225 15:54:05.488266 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 25 15:54:05 crc kubenswrapper[4937]: I0225 15:54:05.576452 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 25 15:54:05 crc kubenswrapper[4937]: I0225 15:54:05.620212 4937 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"console-operator-config" Feb 25 15:54:06 crc kubenswrapper[4937]: I0225 15:54:06.159881 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 25 15:54:06 crc kubenswrapper[4937]: I0225 15:54:06.411568 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 25 15:54:06 crc kubenswrapper[4937]: I0225 15:54:06.607373 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 25 15:54:06 crc kubenswrapper[4937]: I0225 15:54:06.862594 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 15:54:06 crc kubenswrapper[4937]: I0225 15:54:06.915262 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.077351 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.447473 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.474560 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.631285 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.757715 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.757779 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.787056 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.893919 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.894273 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.894327 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.894365 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.894369 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.894387 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.894443 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.894443 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.894502 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.894556 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.894953 4937 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.894983 4937 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.895007 4937 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.895029 4937 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.897210 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.897269 4937 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2f59940a441b8bf7c213dadb16cfdacb7fdd87ff4b235b1a6339a8b538eed79e" exitCode=137 Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.897320 4937 scope.go:117] "RemoveContainer" containerID="2f59940a441b8bf7c213dadb16cfdacb7fdd87ff4b235b1a6339a8b538eed79e" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.897387 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.905010 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.933692 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.962288 4937 scope.go:117] "RemoveContainer" containerID="2f59940a441b8bf7c213dadb16cfdacb7fdd87ff4b235b1a6339a8b538eed79e" Feb 25 15:54:07 crc kubenswrapper[4937]: E0225 15:54:07.962816 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f59940a441b8bf7c213dadb16cfdacb7fdd87ff4b235b1a6339a8b538eed79e\": container with ID starting with 2f59940a441b8bf7c213dadb16cfdacb7fdd87ff4b235b1a6339a8b538eed79e not found: ID does not exist" containerID="2f59940a441b8bf7c213dadb16cfdacb7fdd87ff4b235b1a6339a8b538eed79e" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.962855 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f59940a441b8bf7c213dadb16cfdacb7fdd87ff4b235b1a6339a8b538eed79e"} err="failed to get container status \"2f59940a441b8bf7c213dadb16cfdacb7fdd87ff4b235b1a6339a8b538eed79e\": rpc error: code = NotFound desc = could not find container \"2f59940a441b8bf7c213dadb16cfdacb7fdd87ff4b235b1a6339a8b538eed79e\": container with ID starting with 2f59940a441b8bf7c213dadb16cfdacb7fdd87ff4b235b1a6339a8b538eed79e not found: ID does not exist" Feb 25 15:54:07 crc kubenswrapper[4937]: I0225 15:54:07.996212 4937 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:08 crc kubenswrapper[4937]: I0225 15:54:08.025049 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 25 15:54:08 crc kubenswrapper[4937]: I0225 15:54:08.059912 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 25 15:54:08 crc kubenswrapper[4937]: I0225 15:54:08.119621 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 25 15:54:08 crc kubenswrapper[4937]: I0225 15:54:08.171883 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 25 15:54:08 crc kubenswrapper[4937]: I0225 15:54:08.176771 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 25 15:54:08 crc kubenswrapper[4937]: I0225 15:54:08.480440 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 25 15:54:08 crc kubenswrapper[4937]: I0225 15:54:08.655889 4937 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 25 15:54:08 crc kubenswrapper[4937]: I0225 15:54:08.675927 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 25 15:54:08 crc kubenswrapper[4937]: I0225 15:54:08.717515 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 25 15:54:08 crc kubenswrapper[4937]: I0225 15:54:08.818452 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" 
Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.015158 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.236823 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.330009 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.368273 4937 scope.go:117] "RemoveContainer" containerID="74b7b25a4e53b9ba1dc21b5169e1b8a1dd55cbb38c6b31e7b8e2d6cb94af884a" Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.382168 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.382404 4937 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.394433 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.394468 4937 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="6c3e935d-42d7-4cf5-bd0e-380a7fe3065e" Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.401565 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.401611 4937 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="6c3e935d-42d7-4cf5-bd0e-380a7fe3065e" Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.421405 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.573551 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.697142 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.921301 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-r5bpn_906509ff-be49-4c28-95b5-9f80cb885ece/marketplace-operator/2.log" Feb 25 15:54:09 crc kubenswrapper[4937]: I0225 15:54:09.959468 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 25 15:54:10 crc kubenswrapper[4937]: I0225 15:54:10.127203 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 15:54:10 crc kubenswrapper[4937]: I0225 15:54:10.156446 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 25 15:54:10 crc kubenswrapper[4937]: I0225 15:54:10.512664 4937 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 25 15:54:10 crc kubenswrapper[4937]: I0225 15:54:10.616413 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 25 15:54:10 crc kubenswrapper[4937]: I0225 15:54:10.735321 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 25 15:54:10 crc kubenswrapper[4937]: I0225 15:54:10.860527 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 25 15:54:10 crc kubenswrapper[4937]: I0225 15:54:10.892743 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 25 15:54:10 crc kubenswrapper[4937]: I0225 15:54:10.929147 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-r5bpn_906509ff-be49-4c28-95b5-9f80cb885ece/marketplace-operator/2.log" Feb 25 15:54:10 crc kubenswrapper[4937]: I0225 15:54:10.929226 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" event={"ID":"906509ff-be49-4c28-95b5-9f80cb885ece","Type":"ContainerStarted","Data":"73b9753cfdf2d17dad772e595e65de687ed31b8d429b43e2af19002994219da0"} Feb 25 15:54:10 crc kubenswrapper[4937]: I0225 15:54:10.929767 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:54:10 crc kubenswrapper[4937]: I0225 15:54:10.933321 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:54:11 crc kubenswrapper[4937]: I0225 15:54:11.090774 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 25 15:54:11 crc kubenswrapper[4937]: I0225 15:54:11.243973 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 25 15:54:11 crc kubenswrapper[4937]: I0225 15:54:11.424182 4937 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 25 15:54:11 crc kubenswrapper[4937]: I0225 15:54:11.450973 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 25 15:54:11 crc kubenswrapper[4937]: I0225 15:54:11.495325 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 15:54:11 crc kubenswrapper[4937]: I0225 15:54:11.495395 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 15:54:11 crc kubenswrapper[4937]: I0225 15:54:11.503569 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 25 15:54:11 crc 
kubenswrapper[4937]: I0225 15:54:11.625009 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 25 15:54:11 crc kubenswrapper[4937]: I0225 15:54:11.795871 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 25 15:54:11 crc kubenswrapper[4937]: I0225 15:54:11.991085 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 25 15:54:12 crc kubenswrapper[4937]: I0225 15:54:12.003089 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 25 15:54:12 crc kubenswrapper[4937]: I0225 15:54:12.009554 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 25 15:54:12 crc kubenswrapper[4937]: I0225 15:54:12.047084 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 25 15:54:12 crc kubenswrapper[4937]: I0225 15:54:12.119249 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 25 15:54:12 crc kubenswrapper[4937]: I0225 15:54:12.257003 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 25 15:54:12 crc kubenswrapper[4937]: I0225 15:54:12.481228 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 25 15:54:12 crc kubenswrapper[4937]: I0225 15:54:12.566467 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 25 15:54:12 crc kubenswrapper[4937]: I0225 15:54:12.608534 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 25 15:54:12 crc kubenswrapper[4937]: I0225 15:54:12.624610 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 25 15:54:12 crc kubenswrapper[4937]: I0225 15:54:12.639663 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 25 15:54:12 crc kubenswrapper[4937]: I0225 15:54:12.861457 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 25 15:54:12 crc kubenswrapper[4937]: I0225 15:54:12.933144 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 25 15:54:13 crc kubenswrapper[4937]: I0225 15:54:13.203878 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 25 15:54:13 crc kubenswrapper[4937]: I0225 15:54:13.437002 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 25 15:54:13 crc kubenswrapper[4937]: I0225 15:54:13.549435 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 25 15:54:13 crc kubenswrapper[4937]: I0225 15:54:13.684381 4937 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 25 15:54:14 crc kubenswrapper[4937]: I0225 15:54:14.534182 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 25 15:54:14 crc kubenswrapper[4937]: I0225 15:54:14.652021 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 25 15:54:14 crc kubenswrapper[4937]: I0225 15:54:14.759320 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 25 15:54:15 crc kubenswrapper[4937]: I0225 15:54:15.050335 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 25 15:54:15 crc kubenswrapper[4937]: I0225 15:54:15.438566 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 25 15:54:15 crc kubenswrapper[4937]: I0225 15:54:15.608150 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 25 15:54:16 crc kubenswrapper[4937]: I0225 15:54:16.004625 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 25 15:54:16 crc kubenswrapper[4937]: I0225 15:54:16.019552 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 25 15:54:16 crc kubenswrapper[4937]: I0225 15:54:16.334966 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 25 15:54:16 crc kubenswrapper[4937]: I0225 15:54:16.547772 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 25 15:54:16 crc kubenswrapper[4937]: I0225 15:54:16.574662 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 25 15:54:16 crc kubenswrapper[4937]: I0225 15:54:16.774750 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 25 15:54:16 crc kubenswrapper[4937]: I0225 15:54:16.871653 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 25 15:54:16 crc kubenswrapper[4937]: I0225 15:54:16.933460 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 25 15:54:17 crc kubenswrapper[4937]: I0225 15:54:17.327973 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 25 15:54:17 crc kubenswrapper[4937]: I0225 15:54:17.411204 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 25 15:54:17 crc kubenswrapper[4937]: I0225 15:54:17.447031 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 25 15:54:17 crc kubenswrapper[4937]: I0225 15:54:17.615508 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 25 15:54:18 crc kubenswrapper[4937]: I0225 15:54:18.055693 4937 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 25 15:54:18 crc kubenswrapper[4937]: I0225 15:54:18.071935 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 25 15:54:18 crc kubenswrapper[4937]: I0225 15:54:18.140293 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 25 15:54:18 crc kubenswrapper[4937]: I0225 15:54:18.500734 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 25 15:54:18 crc kubenswrapper[4937]: I0225 15:54:18.948013 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 25 15:54:19 crc kubenswrapper[4937]: I0225 15:54:19.239851 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 25 15:54:19 crc kubenswrapper[4937]: I0225 15:54:19.295508 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 25 15:54:19 crc kubenswrapper[4937]: I0225 15:54:19.570138 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 25 15:54:19 crc kubenswrapper[4937]: I0225 15:54:19.793193 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 25 15:54:20 crc kubenswrapper[4937]: I0225 15:54:20.900820 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 25 15:54:21 crc kubenswrapper[4937]: I0225 15:54:21.271568 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 25 15:54:21 crc kubenswrapper[4937]: I0225 15:54:21.297428 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 25 15:54:21 crc kubenswrapper[4937]: I0225 15:54:21.542886 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 15:54:22 crc kubenswrapper[4937]: I0225 15:54:22.045199 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 25 15:54:22 crc kubenswrapper[4937]: I0225 15:54:22.180340 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 25 15:54:22 crc kubenswrapper[4937]: I0225 15:54:22.425451 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 25 15:54:22 crc kubenswrapper[4937]: I0225 15:54:22.782137 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 25 15:54:23 crc kubenswrapper[4937]: I0225 15:54:23.347094 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 25 15:54:23 crc kubenswrapper[4937]: I0225 15:54:23.537029 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 25 15:54:23 crc kubenswrapper[4937]: I0225 15:54:23.683226 4937 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 25 15:54:23 crc kubenswrapper[4937]: I0225 15:54:23.829736 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 25 15:54:24 crc kubenswrapper[4937]: I0225 15:54:24.882588 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 25 15:54:25 crc kubenswrapper[4937]: I0225 15:54:25.184981 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 25 15:54:25 crc kubenswrapper[4937]: I0225 15:54:25.234644 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 25 15:54:25 crc kubenswrapper[4937]: I0225 15:54:25.943992 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 25 15:54:26 crc kubenswrapper[4937]: I0225 15:54:26.755380 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 25 15:54:26 crc kubenswrapper[4937]: I0225 15:54:26.920637 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 25 15:54:27 crc kubenswrapper[4937]: I0225 15:54:27.278071 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 25 15:54:27 crc kubenswrapper[4937]: I0225 15:54:27.650977 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.142725 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74c6468575-pmbdt"] Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.143159 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" podUID="d1a4d531-e2dd-460e-ba0c-2c6572c5dea4" containerName="controller-manager" containerID="cri-o://0bbeaa7e6741fe652f1a8eb13a12f304dd439e906b21179a15114b20dda839b0" gracePeriod=30 Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.226568 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb"] Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.226787 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" podUID="becb881e-db45-4958-90a0-0d20250d2ff1" containerName="route-controller-manager" containerID="cri-o://df25e3ac8c6cd219bc5ce83338ca874e4499ce3d25028898b77497c221eba258" gracePeriod=30 Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.528607 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.593388 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.671598 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-client-ca\") pod \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.671714 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-proxy-ca-bundles\") pod \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.671765 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-config\") pod \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.671810 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gwbh\" (UniqueName: \"kubernetes.io/projected/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-kube-api-access-6gwbh\") pod \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.671866 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-serving-cert\") pod \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\" (UID: \"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4\") " Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.672558 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-client-ca" (OuterVolumeSpecName: "client-ca") pod "d1a4d531-e2dd-460e-ba0c-2c6572c5dea4" (UID: "d1a4d531-e2dd-460e-ba0c-2c6572c5dea4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.672616 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d1a4d531-e2dd-460e-ba0c-2c6572c5dea4" (UID: "d1a4d531-e2dd-460e-ba0c-2c6572c5dea4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.672672 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-config" (OuterVolumeSpecName: "config") pod "d1a4d531-e2dd-460e-ba0c-2c6572c5dea4" (UID: "d1a4d531-e2dd-460e-ba0c-2c6572c5dea4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.677631 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-kube-api-access-6gwbh" (OuterVolumeSpecName: "kube-api-access-6gwbh") pod "d1a4d531-e2dd-460e-ba0c-2c6572c5dea4" (UID: "d1a4d531-e2dd-460e-ba0c-2c6572c5dea4"). 
InnerVolumeSpecName "kube-api-access-6gwbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.677792 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d1a4d531-e2dd-460e-ba0c-2c6572c5dea4" (UID: "d1a4d531-e2dd-460e-ba0c-2c6572c5dea4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.773425 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/becb881e-db45-4958-90a0-0d20250d2ff1-serving-cert\") pod \"becb881e-db45-4958-90a0-0d20250d2ff1\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.773546 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/becb881e-db45-4958-90a0-0d20250d2ff1-config\") pod \"becb881e-db45-4958-90a0-0d20250d2ff1\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.773602 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/becb881e-db45-4958-90a0-0d20250d2ff1-client-ca\") pod \"becb881e-db45-4958-90a0-0d20250d2ff1\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.773698 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8szzd\" (UniqueName: \"kubernetes.io/projected/becb881e-db45-4958-90a0-0d20250d2ff1-kube-api-access-8szzd\") pod \"becb881e-db45-4958-90a0-0d20250d2ff1\" (UID: \"becb881e-db45-4958-90a0-0d20250d2ff1\") " Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.773950 4937 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.773974 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.773987 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gwbh\" (UniqueName: \"kubernetes.io/projected/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-kube-api-access-6gwbh\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.774000 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.774010 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.774570 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/becb881e-db45-4958-90a0-0d20250d2ff1-client-ca" (OuterVolumeSpecName: "client-ca") pod "becb881e-db45-4958-90a0-0d20250d2ff1" (UID: 
"becb881e-db45-4958-90a0-0d20250d2ff1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.774683 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/becb881e-db45-4958-90a0-0d20250d2ff1-config" (OuterVolumeSpecName: "config") pod "becb881e-db45-4958-90a0-0d20250d2ff1" (UID: "becb881e-db45-4958-90a0-0d20250d2ff1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.777151 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/becb881e-db45-4958-90a0-0d20250d2ff1-kube-api-access-8szzd" (OuterVolumeSpecName: "kube-api-access-8szzd") pod "becb881e-db45-4958-90a0-0d20250d2ff1" (UID: "becb881e-db45-4958-90a0-0d20250d2ff1"). InnerVolumeSpecName "kube-api-access-8szzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.778039 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/becb881e-db45-4958-90a0-0d20250d2ff1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "becb881e-db45-4958-90a0-0d20250d2ff1" (UID: "becb881e-db45-4958-90a0-0d20250d2ff1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.875269 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8szzd\" (UniqueName: \"kubernetes.io/projected/becb881e-db45-4958-90a0-0d20250d2ff1-kube-api-access-8szzd\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.875315 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/becb881e-db45-4958-90a0-0d20250d2ff1-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.875329 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/becb881e-db45-4958-90a0-0d20250d2ff1-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:28 crc kubenswrapper[4937]: I0225 15:54:28.875343 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/becb881e-db45-4958-90a0-0d20250d2ff1-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.049946 4937 generic.go:334] "Generic (PLEG): container finished" podID="becb881e-db45-4958-90a0-0d20250d2ff1" containerID="df25e3ac8c6cd219bc5ce83338ca874e4499ce3d25028898b77497c221eba258" exitCode=0 Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.050037 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.050020 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" event={"ID":"becb881e-db45-4958-90a0-0d20250d2ff1","Type":"ContainerDied","Data":"df25e3ac8c6cd219bc5ce83338ca874e4499ce3d25028898b77497c221eba258"} Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.050244 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb" event={"ID":"becb881e-db45-4958-90a0-0d20250d2ff1","Type":"ContainerDied","Data":"668898a46515f27931afa571ec096fb1253b101be70fb535651109e43fb5d48f"} Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.050312 4937 scope.go:117] "RemoveContainer" containerID="df25e3ac8c6cd219bc5ce83338ca874e4499ce3d25028898b77497c221eba258" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.051819 4937 generic.go:334] "Generic (PLEG): container finished" podID="d1a4d531-e2dd-460e-ba0c-2c6572c5dea4" containerID="0bbeaa7e6741fe652f1a8eb13a12f304dd439e906b21179a15114b20dda839b0" exitCode=0 Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.051870 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" event={"ID":"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4","Type":"ContainerDied","Data":"0bbeaa7e6741fe652f1a8eb13a12f304dd439e906b21179a15114b20dda839b0"} Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.051912 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.051923 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74c6468575-pmbdt" event={"ID":"d1a4d531-e2dd-460e-ba0c-2c6572c5dea4","Type":"ContainerDied","Data":"2acdb5949b753bb791267d5f873350f07d7865397a6ef6fe01e4e3154d4eb13e"} Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.069029 4937 scope.go:117] "RemoveContainer" containerID="df25e3ac8c6cd219bc5ce83338ca874e4499ce3d25028898b77497c221eba258" Feb 25 15:54:29 crc kubenswrapper[4937]: E0225 15:54:29.069992 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df25e3ac8c6cd219bc5ce83338ca874e4499ce3d25028898b77497c221eba258\": container with ID starting with df25e3ac8c6cd219bc5ce83338ca874e4499ce3d25028898b77497c221eba258 not found: ID does not exist" containerID="df25e3ac8c6cd219bc5ce83338ca874e4499ce3d25028898b77497c221eba258" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.070042 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df25e3ac8c6cd219bc5ce83338ca874e4499ce3d25028898b77497c221eba258"} err="failed to get container status \"df25e3ac8c6cd219bc5ce83338ca874e4499ce3d25028898b77497c221eba258\": rpc error: code = NotFound desc = could not find container \"df25e3ac8c6cd219bc5ce83338ca874e4499ce3d25028898b77497c221eba258\": container with ID starting with df25e3ac8c6cd219bc5ce83338ca874e4499ce3d25028898b77497c221eba258 not found: ID does not exist" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.070075 4937 scope.go:117] "RemoveContainer" containerID="0bbeaa7e6741fe652f1a8eb13a12f304dd439e906b21179a15114b20dda839b0" Feb 
25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.094551 4937 scope.go:117] "RemoveContainer" containerID="0bbeaa7e6741fe652f1a8eb13a12f304dd439e906b21179a15114b20dda839b0" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.095325 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb"] Feb 25 15:54:29 crc kubenswrapper[4937]: E0225 15:54:29.097169 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bbeaa7e6741fe652f1a8eb13a12f304dd439e906b21179a15114b20dda839b0\": container with ID starting with 0bbeaa7e6741fe652f1a8eb13a12f304dd439e906b21179a15114b20dda839b0 not found: ID does not exist" containerID="0bbeaa7e6741fe652f1a8eb13a12f304dd439e906b21179a15114b20dda839b0" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.097341 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bbeaa7e6741fe652f1a8eb13a12f304dd439e906b21179a15114b20dda839b0"} err="failed to get container status \"0bbeaa7e6741fe652f1a8eb13a12f304dd439e906b21179a15114b20dda839b0\": rpc error: code = NotFound desc = could not find container \"0bbeaa7e6741fe652f1a8eb13a12f304dd439e906b21179a15114b20dda839b0\": container with ID starting with 0bbeaa7e6741fe652f1a8eb13a12f304dd439e906b21179a15114b20dda839b0 not found: ID does not exist" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.102753 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65cb4f6c9c-2z2lb"] Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.106930 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74c6468575-pmbdt"] Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.110103 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74c6468575-pmbdt"] Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.250823 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b68559b99-thmd2"] Feb 25 15:54:29 crc kubenswrapper[4937]: E0225 15:54:29.251073 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a4d531-e2dd-460e-ba0c-2c6572c5dea4" containerName="controller-manager" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.251088 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a4d531-e2dd-460e-ba0c-2c6572c5dea4" containerName="controller-manager" Feb 25 15:54:29 crc kubenswrapper[4937]: E0225 15:54:29.251104 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="becb881e-db45-4958-90a0-0d20250d2ff1" containerName="route-controller-manager" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.251113 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="becb881e-db45-4958-90a0-0d20250d2ff1" containerName="route-controller-manager" Feb 25 15:54:29 crc kubenswrapper[4937]: E0225 15:54:29.251132 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.251142 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.251252 4937 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.251269 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a4d531-e2dd-460e-ba0c-2c6572c5dea4" containerName="controller-manager" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.251282 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="becb881e-db45-4958-90a0-0d20250d2ff1" containerName="route-controller-manager" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.251705 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.254008 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.254217 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.254373 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.254594 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.254823 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.257861 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.264514 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.379226 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="becb881e-db45-4958-90a0-0d20250d2ff1" path="/var/lib/kubelet/pods/becb881e-db45-4958-90a0-0d20250d2ff1/volumes" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.380334 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a4d531-e2dd-460e-ba0c-2c6572c5dea4" path="/var/lib/kubelet/pods/d1a4d531-e2dd-460e-ba0c-2c6572c5dea4/volumes" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.387100 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08618220-3367-4c68-94a1-00005a969932-serving-cert\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.387438 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75hvw\" (UniqueName: \"kubernetes.io/projected/08618220-3367-4c68-94a1-00005a969932-kube-api-access-75hvw\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.387665 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-proxy-ca-bundles\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.387866 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-config\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.388075 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-client-ca\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.490015 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75hvw\" (UniqueName: \"kubernetes.io/projected/08618220-3367-4c68-94a1-00005a969932-kube-api-access-75hvw\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.490110 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-proxy-ca-bundles\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.490209 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-config\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.490286 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-client-ca\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.490397 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08618220-3367-4c68-94a1-00005a969932-serving-cert\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.492411 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-proxy-ca-bundles\") pod \"controller-manager-7b68559b99-thmd2\" (UID: 
\"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.492473 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-client-ca\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.493576 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-config\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.496807 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08618220-3367-4c68-94a1-00005a969932-serving-cert\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.528088 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75hvw\" (UniqueName: \"kubernetes.io/projected/08618220-3367-4c68-94a1-00005a969932-kube-api-access-75hvw\") pod \"controller-manager-7b68559b99-thmd2\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:29 crc kubenswrapper[4937]: I0225 15:54:29.595404 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.250057 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x"] Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.251284 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.253642 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.253809 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.253989 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.254118 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.257705 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.258640 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.299847 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72438e28-0ed7-400b-9978-73397624365a-config\") pod \"route-controller-manager-5bbf878bb9-vng9x\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.299910 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72438e28-0ed7-400b-9978-73397624365a-serving-cert\") pod \"route-controller-manager-5bbf878bb9-vng9x\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.300004 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72438e28-0ed7-400b-9978-73397624365a-client-ca\") pod \"route-controller-manager-5bbf878bb9-vng9x\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.300040 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv9s6\" (UniqueName: \"kubernetes.io/projected/72438e28-0ed7-400b-9978-73397624365a-kube-api-access-nv9s6\") pod \"route-controller-manager-5bbf878bb9-vng9x\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.401850 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72438e28-0ed7-400b-9978-73397624365a-config\") pod \"route-controller-manager-5bbf878bb9-vng9x\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.401916 
4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72438e28-0ed7-400b-9978-73397624365a-serving-cert\") pod \"route-controller-manager-5bbf878bb9-vng9x\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.402022 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72438e28-0ed7-400b-9978-73397624365a-client-ca\") pod \"route-controller-manager-5bbf878bb9-vng9x\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.402059 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv9s6\" (UniqueName: \"kubernetes.io/projected/72438e28-0ed7-400b-9978-73397624365a-kube-api-access-nv9s6\") pod \"route-controller-manager-5bbf878bb9-vng9x\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.403744 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72438e28-0ed7-400b-9978-73397624365a-client-ca\") pod \"route-controller-manager-5bbf878bb9-vng9x\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.404905 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72438e28-0ed7-400b-9978-73397624365a-config\") pod \"route-controller-manager-5bbf878bb9-vng9x\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.412245 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72438e28-0ed7-400b-9978-73397624365a-serving-cert\") pod \"route-controller-manager-5bbf878bb9-vng9x\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.426476 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv9s6\" (UniqueName: \"kubernetes.io/projected/72438e28-0ed7-400b-9978-73397624365a-kube-api-access-nv9s6\") pod \"route-controller-manager-5bbf878bb9-vng9x\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:30 crc kubenswrapper[4937]: I0225 15:54:30.571397 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:32 crc kubenswrapper[4937]: I0225 15:54:32.014212 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533912-7rslj"] Feb 25 15:54:32 crc kubenswrapper[4937]: I0225 15:54:32.019346 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533914-nfxdw"] Feb 25 15:54:32 crc kubenswrapper[4937]: I0225 15:54:32.022767 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b68559b99-thmd2"] Feb 25 15:54:32 crc kubenswrapper[4937]: I0225 15:54:32.036889 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x"] Feb 25 15:54:32 crc kubenswrapper[4937]: I0225 15:54:32.266389 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x"] Feb 25 15:54:32 crc kubenswrapper[4937]: I0225 15:54:32.324097 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533912-7rslj"] Feb 25 15:54:32 crc kubenswrapper[4937]: W0225 15:54:32.334071 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cf00929_ff8b_42c5_96f8_6f02e52372ae.slice/crio-d22fb17ec0c5868e47aa850632cc42f4b76058dd582ebecf690cc27f448a9b61 WatchSource:0}: Error finding container d22fb17ec0c5868e47aa850632cc42f4b76058dd582ebecf690cc27f448a9b61: Status 404 returned error can't find the container with id d22fb17ec0c5868e47aa850632cc42f4b76058dd582ebecf690cc27f448a9b61 Feb 25 15:54:32 crc kubenswrapper[4937]: I0225 15:54:32.357530 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b68559b99-thmd2"] Feb 25 15:54:32 crc kubenswrapper[4937]: W0225 15:54:32.377543 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08618220_3367_4c68_94a1_00005a969932.slice/crio-ece33d2cbf7729b9097d8eea1f1d1fa0e7284fe9a3985f9db8b15c52748c4648 WatchSource:0}: Error finding container ece33d2cbf7729b9097d8eea1f1d1fa0e7284fe9a3985f9db8b15c52748c4648: Status 404 returned error can't find the container with id ece33d2cbf7729b9097d8eea1f1d1fa0e7284fe9a3985f9db8b15c52748c4648 Feb 25 15:54:32 crc kubenswrapper[4937]: I0225 15:54:32.506181 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533914-nfxdw"] Feb 25 15:54:32 crc kubenswrapper[4937]: W0225 15:54:32.514902 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode90b54e9_9441_4afc_a28d_6febea150a04.slice/crio-28818a8320857523132e85cdbd11d82f4d5d4cb577909aa8d0f9a8c4a2abf7fc WatchSource:0}: Error finding container 28818a8320857523132e85cdbd11d82f4d5d4cb577909aa8d0f9a8c4a2abf7fc: Status 404 returned error can't find the container with id 28818a8320857523132e85cdbd11d82f4d5d4cb577909aa8d0f9a8c4a2abf7fc Feb 25 15:54:33 crc kubenswrapper[4937]: I0225 15:54:33.075880 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533912-7rslj" event={"ID":"6cf00929-ff8b-42c5-96f8-6f02e52372ae","Type":"ContainerStarted","Data":"d22fb17ec0c5868e47aa850632cc42f4b76058dd582ebecf690cc27f448a9b61"} Feb 25 15:54:33 crc 
kubenswrapper[4937]: I0225 15:54:33.077732 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" event={"ID":"08618220-3367-4c68-94a1-00005a969932","Type":"ContainerStarted","Data":"99bc294acdc2ceea43700d72cecad9ab8c3b92393d051b4de6bf1cd4d772d603"} Feb 25 15:54:33 crc kubenswrapper[4937]: I0225 15:54:33.077843 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" event={"ID":"08618220-3367-4c68-94a1-00005a969932","Type":"ContainerStarted","Data":"ece33d2cbf7729b9097d8eea1f1d1fa0e7284fe9a3985f9db8b15c52748c4648"} Feb 25 15:54:33 crc kubenswrapper[4937]: I0225 15:54:33.077912 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:33 crc kubenswrapper[4937]: I0225 15:54:33.079086 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533914-nfxdw" event={"ID":"e90b54e9-9441-4afc-a28d-6febea150a04","Type":"ContainerStarted","Data":"28818a8320857523132e85cdbd11d82f4d5d4cb577909aa8d0f9a8c4a2abf7fc"} Feb 25 15:54:33 crc kubenswrapper[4937]: I0225 15:54:33.082257 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" event={"ID":"72438e28-0ed7-400b-9978-73397624365a","Type":"ContainerStarted","Data":"9445dbb8f7b03169408d3926b873958a0d41294cc2a95bbb635b9fb79eb1fcf1"} Feb 25 15:54:33 crc kubenswrapper[4937]: I0225 15:54:33.082289 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" event={"ID":"72438e28-0ed7-400b-9978-73397624365a","Type":"ContainerStarted","Data":"a62cfca13a1f825cc9ad5c5f9e4809020fa1f7ba3be83427367420e80f86b3ab"} Feb 25 15:54:33 crc kubenswrapper[4937]: I0225 15:54:33.083033 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:33 crc kubenswrapper[4937]: I0225 15:54:33.087602 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:33 crc kubenswrapper[4937]: I0225 15:54:33.131196 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" podStartSLOduration=5.131167777 podStartE2EDuration="5.131167777s" podCreationTimestamp="2026-02-25 15:54:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:54:33.102748512 +0000 UTC m=+524.116140402" watchObservedRunningTime="2026-02-25 15:54:33.131167777 +0000 UTC m=+524.144559667" Feb 25 15:54:33 crc kubenswrapper[4937]: I0225 15:54:33.132021 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" podStartSLOduration=5.1320141679999995 podStartE2EDuration="5.132014168s" podCreationTimestamp="2026-02-25 15:54:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:54:33.1261227 +0000 UTC m=+524.139514640" watchObservedRunningTime="2026-02-25 15:54:33.132014168 +0000 UTC m=+524.145406058" Feb 25 15:54:33 crc 
kubenswrapper[4937]: I0225 15:54:33.239478 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:34 crc kubenswrapper[4937]: I0225 15:54:34.338774 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b68559b99-thmd2"] Feb 25 15:54:34 crc kubenswrapper[4937]: I0225 15:54:34.347858 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x"] Feb 25 15:54:35 crc kubenswrapper[4937]: I0225 15:54:35.096705 4937 generic.go:334] "Generic (PLEG): container finished" podID="6cf00929-ff8b-42c5-96f8-6f02e52372ae" containerID="51af0abc0a79790cba8e41933bf07cb316171e1d1e922535f0b6f409bf32cad5" exitCode=0 Feb 25 15:54:35 crc kubenswrapper[4937]: I0225 15:54:35.096821 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533912-7rslj" event={"ID":"6cf00929-ff8b-42c5-96f8-6f02e52372ae","Type":"ContainerDied","Data":"51af0abc0a79790cba8e41933bf07cb316171e1d1e922535f0b6f409bf32cad5"} Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.104857 4937 generic.go:334] "Generic (PLEG): container finished" podID="e90b54e9-9441-4afc-a28d-6febea150a04" containerID="b1343e02223c1e86e17c58240086658a4b55da57ea15d8bda5b3e01b5c902978" exitCode=0 Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.105269 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" podUID="72438e28-0ed7-400b-9978-73397624365a" containerName="route-controller-manager" containerID="cri-o://9445dbb8f7b03169408d3926b873958a0d41294cc2a95bbb635b9fb79eb1fcf1" gracePeriod=30 Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.105098 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533914-nfxdw" event={"ID":"e90b54e9-9441-4afc-a28d-6febea150a04","Type":"ContainerDied","Data":"b1343e02223c1e86e17c58240086658a4b55da57ea15d8bda5b3e01b5c902978"} Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.105413 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" podUID="08618220-3367-4c68-94a1-00005a969932" containerName="controller-manager" containerID="cri-o://99bc294acdc2ceea43700d72cecad9ab8c3b92393d051b4de6bf1cd4d772d603" gracePeriod=30 Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.432025 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533912-7rslj" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.478063 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czg5v\" (UniqueName: \"kubernetes.io/projected/6cf00929-ff8b-42c5-96f8-6f02e52372ae-kube-api-access-czg5v\") pod \"6cf00929-ff8b-42c5-96f8-6f02e52372ae\" (UID: \"6cf00929-ff8b-42c5-96f8-6f02e52372ae\") " Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.498646 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf00929-ff8b-42c5-96f8-6f02e52372ae-kube-api-access-czg5v" (OuterVolumeSpecName: "kube-api-access-czg5v") pod "6cf00929-ff8b-42c5-96f8-6f02e52372ae" (UID: "6cf00929-ff8b-42c5-96f8-6f02e52372ae"). InnerVolumeSpecName "kube-api-access-czg5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.576145 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.578717 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czg5v\" (UniqueName: \"kubernetes.io/projected/6cf00929-ff8b-42c5-96f8-6f02e52372ae-kube-api-access-czg5v\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.637988 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.679363 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08618220-3367-4c68-94a1-00005a969932-serving-cert\") pod \"08618220-3367-4c68-94a1-00005a969932\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.679593 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-proxy-ca-bundles\") pod \"08618220-3367-4c68-94a1-00005a969932\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.679657 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72438e28-0ed7-400b-9978-73397624365a-serving-cert\") pod \"72438e28-0ed7-400b-9978-73397624365a\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.679684 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-config\") pod \"08618220-3367-4c68-94a1-00005a969932\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.679706 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv9s6\" (UniqueName: \"kubernetes.io/projected/72438e28-0ed7-400b-9978-73397624365a-kube-api-access-nv9s6\") pod \"72438e28-0ed7-400b-9978-73397624365a\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.679777 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75hvw\" (UniqueName: \"kubernetes.io/projected/08618220-3367-4c68-94a1-00005a969932-kube-api-access-75hvw\") pod \"08618220-3367-4c68-94a1-00005a969932\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.679803 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-client-ca\") pod \"08618220-3367-4c68-94a1-00005a969932\" (UID: \"08618220-3367-4c68-94a1-00005a969932\") " Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.679854 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72438e28-0ed7-400b-9978-73397624365a-client-ca\") pod \"72438e28-0ed7-400b-9978-73397624365a\" (UID: 
\"72438e28-0ed7-400b-9978-73397624365a\") " Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.680927 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-client-ca" (OuterVolumeSpecName: "client-ca") pod "08618220-3367-4c68-94a1-00005a969932" (UID: "08618220-3367-4c68-94a1-00005a969932"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.681102 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72438e28-0ed7-400b-9978-73397624365a-client-ca" (OuterVolumeSpecName: "client-ca") pod "72438e28-0ed7-400b-9978-73397624365a" (UID: "72438e28-0ed7-400b-9978-73397624365a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.681428 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-config" (OuterVolumeSpecName: "config") pod "08618220-3367-4c68-94a1-00005a969932" (UID: "08618220-3367-4c68-94a1-00005a969932"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.682083 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "08618220-3367-4c68-94a1-00005a969932" (UID: "08618220-3367-4c68-94a1-00005a969932"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.682270 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08618220-3367-4c68-94a1-00005a969932-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "08618220-3367-4c68-94a1-00005a969932" (UID: "08618220-3367-4c68-94a1-00005a969932"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.683307 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72438e28-0ed7-400b-9978-73397624365a-kube-api-access-nv9s6" (OuterVolumeSpecName: "kube-api-access-nv9s6") pod "72438e28-0ed7-400b-9978-73397624365a" (UID: "72438e28-0ed7-400b-9978-73397624365a"). InnerVolumeSpecName "kube-api-access-nv9s6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.683355 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08618220-3367-4c68-94a1-00005a969932-kube-api-access-75hvw" (OuterVolumeSpecName: "kube-api-access-75hvw") pod "08618220-3367-4c68-94a1-00005a969932" (UID: "08618220-3367-4c68-94a1-00005a969932"). InnerVolumeSpecName "kube-api-access-75hvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.683432 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72438e28-0ed7-400b-9978-73397624365a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "72438e28-0ed7-400b-9978-73397624365a" (UID: "72438e28-0ed7-400b-9978-73397624365a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.781105 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72438e28-0ed7-400b-9978-73397624365a-config\") pod \"72438e28-0ed7-400b-9978-73397624365a\" (UID: \"72438e28-0ed7-400b-9978-73397624365a\") " Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.781440 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08618220-3367-4c68-94a1-00005a969932-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.781469 4937 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.781512 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72438e28-0ed7-400b-9978-73397624365a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.781530 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.781547 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv9s6\" (UniqueName: \"kubernetes.io/projected/72438e28-0ed7-400b-9978-73397624365a-kube-api-access-nv9s6\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.781566 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75hvw\" (UniqueName: \"kubernetes.io/projected/08618220-3367-4c68-94a1-00005a969932-kube-api-access-75hvw\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.781582 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08618220-3367-4c68-94a1-00005a969932-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.781598 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72438e28-0ed7-400b-9978-73397624365a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.781657 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72438e28-0ed7-400b-9978-73397624365a-config" (OuterVolumeSpecName: "config") pod "72438e28-0ed7-400b-9978-73397624365a" (UID: "72438e28-0ed7-400b-9978-73397624365a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:36 crc kubenswrapper[4937]: I0225 15:54:36.882713 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72438e28-0ed7-400b-9978-73397624365a-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.112532 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533912-7rslj" event={"ID":"6cf00929-ff8b-42c5-96f8-6f02e52372ae","Type":"ContainerDied","Data":"d22fb17ec0c5868e47aa850632cc42f4b76058dd582ebecf690cc27f448a9b61"} Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.113606 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d22fb17ec0c5868e47aa850632cc42f4b76058dd582ebecf690cc27f448a9b61" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.112560 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533912-7rslj" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.114028 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.113972 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" event={"ID":"08618220-3367-4c68-94a1-00005a969932","Type":"ContainerDied","Data":"99bc294acdc2ceea43700d72cecad9ab8c3b92393d051b4de6bf1cd4d772d603"} Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.113952 4937 generic.go:334] "Generic (PLEG): container finished" podID="08618220-3367-4c68-94a1-00005a969932" containerID="99bc294acdc2ceea43700d72cecad9ab8c3b92393d051b4de6bf1cd4d772d603" exitCode=0 Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.114095 4937 scope.go:117] "RemoveContainer" containerID="99bc294acdc2ceea43700d72cecad9ab8c3b92393d051b4de6bf1cd4d772d603" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.114177 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b68559b99-thmd2" event={"ID":"08618220-3367-4c68-94a1-00005a969932","Type":"ContainerDied","Data":"ece33d2cbf7729b9097d8eea1f1d1fa0e7284fe9a3985f9db8b15c52748c4648"} Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.116508 4937 generic.go:334] "Generic (PLEG): container finished" podID="72438e28-0ed7-400b-9978-73397624365a" containerID="9445dbb8f7b03169408d3926b873958a0d41294cc2a95bbb635b9fb79eb1fcf1" exitCode=0 Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.116773 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.118775 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" event={"ID":"72438e28-0ed7-400b-9978-73397624365a","Type":"ContainerDied","Data":"9445dbb8f7b03169408d3926b873958a0d41294cc2a95bbb635b9fb79eb1fcf1"} Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.118869 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x" event={"ID":"72438e28-0ed7-400b-9978-73397624365a","Type":"ContainerDied","Data":"a62cfca13a1f825cc9ad5c5f9e4809020fa1f7ba3be83427367420e80f86b3ab"} Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.135272 4937 scope.go:117] "RemoveContainer" containerID="99bc294acdc2ceea43700d72cecad9ab8c3b92393d051b4de6bf1cd4d772d603" Feb 25 15:54:37 crc kubenswrapper[4937]: E0225 15:54:37.135714 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99bc294acdc2ceea43700d72cecad9ab8c3b92393d051b4de6bf1cd4d772d603\": container with ID starting with 99bc294acdc2ceea43700d72cecad9ab8c3b92393d051b4de6bf1cd4d772d603 not found: ID does not exist" containerID="99bc294acdc2ceea43700d72cecad9ab8c3b92393d051b4de6bf1cd4d772d603" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.135746 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99bc294acdc2ceea43700d72cecad9ab8c3b92393d051b4de6bf1cd4d772d603"} err="failed to get container status \"99bc294acdc2ceea43700d72cecad9ab8c3b92393d051b4de6bf1cd4d772d603\": rpc error: code = NotFound desc = could not find container \"99bc294acdc2ceea43700d72cecad9ab8c3b92393d051b4de6bf1cd4d772d603\": container with ID starting with 99bc294acdc2ceea43700d72cecad9ab8c3b92393d051b4de6bf1cd4d772d603 not found: ID does not exist" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.135769 4937 scope.go:117] "RemoveContainer" containerID="9445dbb8f7b03169408d3926b873958a0d41294cc2a95bbb635b9fb79eb1fcf1" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.155747 4937 scope.go:117] "RemoveContainer" containerID="9445dbb8f7b03169408d3926b873958a0d41294cc2a95bbb635b9fb79eb1fcf1" Feb 25 15:54:37 crc kubenswrapper[4937]: E0225 15:54:37.156262 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9445dbb8f7b03169408d3926b873958a0d41294cc2a95bbb635b9fb79eb1fcf1\": container with ID starting with 9445dbb8f7b03169408d3926b873958a0d41294cc2a95bbb635b9fb79eb1fcf1 not found: ID does not exist" containerID="9445dbb8f7b03169408d3926b873958a0d41294cc2a95bbb635b9fb79eb1fcf1" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.156343 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9445dbb8f7b03169408d3926b873958a0d41294cc2a95bbb635b9fb79eb1fcf1"} err="failed to get container status \"9445dbb8f7b03169408d3926b873958a0d41294cc2a95bbb635b9fb79eb1fcf1\": rpc error: code = NotFound desc = could not find container \"9445dbb8f7b03169408d3926b873958a0d41294cc2a95bbb635b9fb79eb1fcf1\": container with ID starting with 9445dbb8f7b03169408d3926b873958a0d41294cc2a95bbb635b9fb79eb1fcf1 not found: ID does not exist" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.167318 4937 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x"] Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.175646 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bbf878bb9-vng9x"] Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.181016 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b68559b99-thmd2"] Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.185297 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b68559b99-thmd2"] Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.303237 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533914-nfxdw" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.374406 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08618220-3367-4c68-94a1-00005a969932" path="/var/lib/kubelet/pods/08618220-3367-4c68-94a1-00005a969932/volumes" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.375016 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72438e28-0ed7-400b-9978-73397624365a" path="/var/lib/kubelet/pods/72438e28-0ed7-400b-9978-73397624365a/volumes" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.488812 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qzxf\" (UniqueName: \"kubernetes.io/projected/e90b54e9-9441-4afc-a28d-6febea150a04-kube-api-access-2qzxf\") pod \"e90b54e9-9441-4afc-a28d-6febea150a04\" (UID: \"e90b54e9-9441-4afc-a28d-6febea150a04\") " Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.495811 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e90b54e9-9441-4afc-a28d-6febea150a04-kube-api-access-2qzxf" (OuterVolumeSpecName: "kube-api-access-2qzxf") pod "e90b54e9-9441-4afc-a28d-6febea150a04" (UID: "e90b54e9-9441-4afc-a28d-6febea150a04"). InnerVolumeSpecName "kube-api-access-2qzxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:54:37 crc kubenswrapper[4937]: I0225 15:54:37.590005 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qzxf\" (UniqueName: \"kubernetes.io/projected/e90b54e9-9441-4afc-a28d-6febea150a04-kube-api-access-2qzxf\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:38 crc kubenswrapper[4937]: I0225 15:54:38.129092 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533914-nfxdw" event={"ID":"e90b54e9-9441-4afc-a28d-6febea150a04","Type":"ContainerDied","Data":"28818a8320857523132e85cdbd11d82f4d5d4cb577909aa8d0f9a8c4a2abf7fc"} Feb 25 15:54:38 crc kubenswrapper[4937]: I0225 15:54:38.129604 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28818a8320857523132e85cdbd11d82f4d5d4cb577909aa8d0f9a8c4a2abf7fc" Feb 25 15:54:38 crc kubenswrapper[4937]: I0225 15:54:38.129170 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533914-nfxdw" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.255630 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll"] Feb 25 15:54:39 crc kubenswrapper[4937]: E0225 15:54:39.255868 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72438e28-0ed7-400b-9978-73397624365a" containerName="route-controller-manager" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.255883 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="72438e28-0ed7-400b-9978-73397624365a" containerName="route-controller-manager" Feb 25 15:54:39 crc kubenswrapper[4937]: E0225 15:54:39.255899 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90b54e9-9441-4afc-a28d-6febea150a04" containerName="oc" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.255905 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90b54e9-9441-4afc-a28d-6febea150a04" containerName="oc" Feb 25 15:54:39 crc kubenswrapper[4937]: E0225 15:54:39.255915 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08618220-3367-4c68-94a1-00005a969932" containerName="controller-manager" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.255922 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="08618220-3367-4c68-94a1-00005a969932" containerName="controller-manager" Feb 25 15:54:39 crc kubenswrapper[4937]: E0225 15:54:39.255931 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf00929-ff8b-42c5-96f8-6f02e52372ae" containerName="oc" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.255936 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf00929-ff8b-42c5-96f8-6f02e52372ae" containerName="oc" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.256024 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="72438e28-0ed7-400b-9978-73397624365a" containerName="route-controller-manager" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.256036 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e90b54e9-9441-4afc-a28d-6febea150a04" containerName="oc" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.256047 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="08618220-3367-4c68-94a1-00005a969932" containerName="controller-manager" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.256054 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf00929-ff8b-42c5-96f8-6f02e52372ae" containerName="oc" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.256411 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.259262 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.259548 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.259846 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.259964 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.260073 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.260247 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.265452 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5856d46cfd-lg5r8"] Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.273638 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5856d46cfd-lg5r8"] Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.273764 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.275904 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.276148 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.276314 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.276469 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.276718 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.276965 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.278182 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll"] Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.281263 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.318374 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-config\") pod \"route-controller-manager-55ddcbdf69-4l2ll\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.318508 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq2xq\" (UniqueName: \"kubernetes.io/projected/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-kube-api-access-sq2xq\") pod \"route-controller-manager-55ddcbdf69-4l2ll\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.318543 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-serving-cert\") pod \"route-controller-manager-55ddcbdf69-4l2ll\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.318571 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-client-ca\") pod \"route-controller-manager-55ddcbdf69-4l2ll\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.318602 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-client-ca\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.318633 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frxbd\" (UniqueName: \"kubernetes.io/projected/b9ec53a0-9ace-4624-a338-76c2bb73599b-kube-api-access-frxbd\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.318658 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9ec53a0-9ace-4624-a338-76c2bb73599b-serving-cert\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.318684 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-config\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.318712 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-proxy-ca-bundles\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.419252 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-proxy-ca-bundles\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.419307 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-config\") pod \"route-controller-manager-55ddcbdf69-4l2ll\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.419532 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq2xq\" (UniqueName: \"kubernetes.io/projected/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-kube-api-access-sq2xq\") pod \"route-controller-manager-55ddcbdf69-4l2ll\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.419573 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-serving-cert\") pod \"route-controller-manager-55ddcbdf69-4l2ll\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.419601 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-client-ca\") pod \"route-controller-manager-55ddcbdf69-4l2ll\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.419635 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-client-ca\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.419668 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frxbd\" (UniqueName: \"kubernetes.io/projected/b9ec53a0-9ace-4624-a338-76c2bb73599b-kube-api-access-frxbd\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.419693 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9ec53a0-9ace-4624-a338-76c2bb73599b-serving-cert\") pod 
\"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.419717 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-config\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.420467 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-client-ca\") pod \"route-controller-manager-55ddcbdf69-4l2ll\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.420538 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-config\") pod \"route-controller-manager-55ddcbdf69-4l2ll\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.421288 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-proxy-ca-bundles\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.421600 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-config\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.421764 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-client-ca\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.425355 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9ec53a0-9ace-4624-a338-76c2bb73599b-serving-cert\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.427136 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-serving-cert\") pod \"route-controller-manager-55ddcbdf69-4l2ll\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.437339 4937 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sq2xq\" (UniqueName: \"kubernetes.io/projected/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-kube-api-access-sq2xq\") pod \"route-controller-manager-55ddcbdf69-4l2ll\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.442911 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frxbd\" (UniqueName: \"kubernetes.io/projected/b9ec53a0-9ace-4624-a338-76c2bb73599b-kube-api-access-frxbd\") pod \"controller-manager-5856d46cfd-lg5r8\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.622477 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.634655 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.876353 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll"] Feb 25 15:54:39 crc kubenswrapper[4937]: W0225 15:54:39.886235 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52f3eb53_e7cc_41bb_b54e_a004fe4a2bf0.slice/crio-dd6561987215bec8b01b9a6629880d8114ec97a5b2e9b527ae4ce1cfba7c8c4e WatchSource:0}: Error finding container dd6561987215bec8b01b9a6629880d8114ec97a5b2e9b527ae4ce1cfba7c8c4e: Status 404 returned error can't find the container with id dd6561987215bec8b01b9a6629880d8114ec97a5b2e9b527ae4ce1cfba7c8c4e Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.906154 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5856d46cfd-lg5r8"] Feb 25 15:54:39 crc kubenswrapper[4937]: W0225 15:54:39.913876 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9ec53a0_9ace_4624_a338_76c2bb73599b.slice/crio-5ce6fa94db27fc76ec1cd87b26ca6d87634e7c35dae7e7c9336baba568497cfb WatchSource:0}: Error finding container 5ce6fa94db27fc76ec1cd87b26ca6d87634e7c35dae7e7c9336baba568497cfb: Status 404 returned error can't find the container with id 5ce6fa94db27fc76ec1cd87b26ca6d87634e7c35dae7e7c9336baba568497cfb Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.973140 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533908-nq4vc"] Feb 25 15:54:39 crc kubenswrapper[4937]: I0225 15:54:39.977354 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533908-nq4vc"] Feb 25 15:54:40 crc kubenswrapper[4937]: I0225 15:54:40.142837 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" event={"ID":"b9ec53a0-9ace-4624-a338-76c2bb73599b","Type":"ContainerStarted","Data":"aaa63e3fbd44180fc925b8331799508e5a32295182fc5841d817031f33791dd4"} Feb 25 15:54:40 crc kubenswrapper[4937]: I0225 15:54:40.142919 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" event={"ID":"b9ec53a0-9ace-4624-a338-76c2bb73599b","Type":"ContainerStarted","Data":"5ce6fa94db27fc76ec1cd87b26ca6d87634e7c35dae7e7c9336baba568497cfb"} Feb 25 15:54:40 crc kubenswrapper[4937]: I0225 15:54:40.142955 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:40 crc kubenswrapper[4937]: I0225 15:54:40.144632 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" event={"ID":"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0","Type":"ContainerStarted","Data":"53370b1510d6bb0db4d90ba7fe20b6b3714d8411078b090ee4e809b97713322d"} Feb 25 15:54:40 crc kubenswrapper[4937]: I0225 15:54:40.144655 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" event={"ID":"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0","Type":"ContainerStarted","Data":"dd6561987215bec8b01b9a6629880d8114ec97a5b2e9b527ae4ce1cfba7c8c4e"} Feb 25 15:54:40 crc kubenswrapper[4937]: I0225 15:54:40.144996 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:40 crc kubenswrapper[4937]: I0225 15:54:40.145113 4937 patch_prober.go:28] interesting pod/controller-manager-5856d46cfd-lg5r8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" start-of-body= Feb 25 15:54:40 crc kubenswrapper[4937]: I0225 15:54:40.145162 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" podUID="b9ec53a0-9ace-4624-a338-76c2bb73599b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" Feb 25 15:54:40 crc kubenswrapper[4937]: I0225 15:54:40.146837 4937 patch_prober.go:28] interesting pod/route-controller-manager-55ddcbdf69-4l2ll container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Feb 25 15:54:40 crc kubenswrapper[4937]: I0225 15:54:40.146921 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" podUID="52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Feb 25 15:54:40 crc kubenswrapper[4937]: I0225 15:54:40.161247 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" podStartSLOduration=6.161223397 podStartE2EDuration="6.161223397s" podCreationTimestamp="2026-02-25 15:54:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:54:40.161060573 +0000 UTC m=+531.174452473" watchObservedRunningTime="2026-02-25 15:54:40.161223397 +0000 UTC m=+531.174615307" Feb 25 15:54:40 crc kubenswrapper[4937]: I0225 
15:54:40.181868 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" podStartSLOduration=6.181845346 podStartE2EDuration="6.181845346s" podCreationTimestamp="2026-02-25 15:54:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:54:40.177923307 +0000 UTC m=+531.191315197" watchObservedRunningTime="2026-02-25 15:54:40.181845346 +0000 UTC m=+531.195237256" Feb 25 15:54:41 crc kubenswrapper[4937]: I0225 15:54:41.154259 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:41 crc kubenswrapper[4937]: I0225 15:54:41.156300 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:41 crc kubenswrapper[4937]: I0225 15:54:41.374133 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c63f82a-9346-476b-ae17-edb260b2a36f" path="/var/lib/kubelet/pods/0c63f82a-9346-476b-ae17-edb260b2a36f/volumes" Feb 25 15:54:41 crc kubenswrapper[4937]: I0225 15:54:41.495257 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 15:54:41 crc kubenswrapper[4937]: I0225 15:54:41.495318 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 15:54:41 crc kubenswrapper[4937]: I0225 15:54:41.495368 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:54:41 crc kubenswrapper[4937]: I0225 15:54:41.496172 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9825ea4151fdf2ed4aea12d0ed9b0d4287c1ad0da21c88a4d5b343d65fcffef"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 15:54:41 crc kubenswrapper[4937]: I0225 15:54:41.496255 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://c9825ea4151fdf2ed4aea12d0ed9b0d4287c1ad0da21c88a4d5b343d65fcffef" gracePeriod=600 Feb 25 15:54:42 crc kubenswrapper[4937]: I0225 15:54:42.159105 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="c9825ea4151fdf2ed4aea12d0ed9b0d4287c1ad0da21c88a4d5b343d65fcffef" exitCode=0 Feb 25 15:54:42 crc kubenswrapper[4937]: I0225 15:54:42.159198 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" 
event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"c9825ea4151fdf2ed4aea12d0ed9b0d4287c1ad0da21c88a4d5b343d65fcffef"} Feb 25 15:54:42 crc kubenswrapper[4937]: I0225 15:54:42.159496 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"81075b52eae9291c3100b4940aeb92ab7d1af5e10120db2c4bfe3a84072fc970"} Feb 25 15:54:42 crc kubenswrapper[4937]: I0225 15:54:42.159522 4937 scope.go:117] "RemoveContainer" containerID="2321e689631270d3475f358c44d33a082f0e876596f0e857761edc28416d7255" Feb 25 15:54:54 crc kubenswrapper[4937]: I0225 15:54:54.352338 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5856d46cfd-lg5r8"] Feb 25 15:54:54 crc kubenswrapper[4937]: I0225 15:54:54.353361 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" podUID="b9ec53a0-9ace-4624-a338-76c2bb73599b" containerName="controller-manager" containerID="cri-o://aaa63e3fbd44180fc925b8331799508e5a32295182fc5841d817031f33791dd4" gracePeriod=30 Feb 25 15:54:54 crc kubenswrapper[4937]: I0225 15:54:54.373320 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll"] Feb 25 15:54:54 crc kubenswrapper[4937]: I0225 15:54:54.373613 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" podUID="52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0" containerName="route-controller-manager" containerID="cri-o://53370b1510d6bb0db4d90ba7fe20b6b3714d8411078b090ee4e809b97713322d" gracePeriod=30 Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.240559 4937 generic.go:334] "Generic (PLEG): container finished" podID="b9ec53a0-9ace-4624-a338-76c2bb73599b" containerID="aaa63e3fbd44180fc925b8331799508e5a32295182fc5841d817031f33791dd4" exitCode=0 Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.240648 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" event={"ID":"b9ec53a0-9ace-4624-a338-76c2bb73599b","Type":"ContainerDied","Data":"aaa63e3fbd44180fc925b8331799508e5a32295182fc5841d817031f33791dd4"} Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.242279 4937 generic.go:334] "Generic (PLEG): container finished" podID="52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0" containerID="53370b1510d6bb0db4d90ba7fe20b6b3714d8411078b090ee4e809b97713322d" exitCode=0 Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.242315 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" event={"ID":"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0","Type":"ContainerDied","Data":"53370b1510d6bb0db4d90ba7fe20b6b3714d8411078b090ee4e809b97713322d"} Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.419917 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.478339 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86"] Feb 25 15:54:55 crc kubenswrapper[4937]: E0225 15:54:55.478696 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0" containerName="route-controller-manager" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.478719 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0" containerName="route-controller-manager" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.478846 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0" containerName="route-controller-manager" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.479434 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.482724 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86"] Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.523252 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-client-ca\") pod \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.523326 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-config\") pod \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.523378 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq2xq\" (UniqueName: \"kubernetes.io/projected/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-kube-api-access-sq2xq\") pod \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.523428 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-serving-cert\") pod \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\" (UID: \"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0\") " Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.525696 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-client-ca" (OuterVolumeSpecName: "client-ca") pod "52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0" (UID: "52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.526455 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-config" (OuterVolumeSpecName: "config") pod "52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0" (UID: "52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.531288 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-kube-api-access-sq2xq" (OuterVolumeSpecName: "kube-api-access-sq2xq") pod "52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0" (UID: "52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0"). InnerVolumeSpecName "kube-api-access-sq2xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.531257 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0" (UID: "52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.555429 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.625540 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frxbd\" (UniqueName: \"kubernetes.io/projected/b9ec53a0-9ace-4624-a338-76c2bb73599b-kube-api-access-frxbd\") pod \"b9ec53a0-9ace-4624-a338-76c2bb73599b\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.625626 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9ec53a0-9ace-4624-a338-76c2bb73599b-serving-cert\") pod \"b9ec53a0-9ace-4624-a338-76c2bb73599b\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.625668 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-config\") pod \"b9ec53a0-9ace-4624-a338-76c2bb73599b\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.625725 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-proxy-ca-bundles\") pod \"b9ec53a0-9ace-4624-a338-76c2bb73599b\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.625754 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-client-ca\") pod \"b9ec53a0-9ace-4624-a338-76c2bb73599b\" (UID: \"b9ec53a0-9ace-4624-a338-76c2bb73599b\") " Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.625936 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d161886-aca0-4216-8cd9-3129713ea236-serving-cert\") pod \"route-controller-manager-64dcfb5d85-9xl86\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.626029 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-48zvl\" (UniqueName: \"kubernetes.io/projected/3d161886-aca0-4216-8cd9-3129713ea236-kube-api-access-48zvl\") pod \"route-controller-manager-64dcfb5d85-9xl86\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.626100 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d161886-aca0-4216-8cd9-3129713ea236-config\") pod \"route-controller-manager-64dcfb5d85-9xl86\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.626155 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d161886-aca0-4216-8cd9-3129713ea236-client-ca\") pod \"route-controller-manager-64dcfb5d85-9xl86\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.626225 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.626239 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.626248 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq2xq\" (UniqueName: \"kubernetes.io/projected/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-kube-api-access-sq2xq\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.626256 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.627238 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-client-ca" (OuterVolumeSpecName: "client-ca") pod "b9ec53a0-9ace-4624-a338-76c2bb73599b" (UID: "b9ec53a0-9ace-4624-a338-76c2bb73599b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.627570 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b9ec53a0-9ace-4624-a338-76c2bb73599b" (UID: "b9ec53a0-9ace-4624-a338-76c2bb73599b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.627686 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-config" (OuterVolumeSpecName: "config") pod "b9ec53a0-9ace-4624-a338-76c2bb73599b" (UID: "b9ec53a0-9ace-4624-a338-76c2bb73599b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.632008 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ec53a0-9ace-4624-a338-76c2bb73599b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b9ec53a0-9ace-4624-a338-76c2bb73599b" (UID: "b9ec53a0-9ace-4624-a338-76c2bb73599b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.652590 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ec53a0-9ace-4624-a338-76c2bb73599b-kube-api-access-frxbd" (OuterVolumeSpecName: "kube-api-access-frxbd") pod "b9ec53a0-9ace-4624-a338-76c2bb73599b" (UID: "b9ec53a0-9ace-4624-a338-76c2bb73599b"). InnerVolumeSpecName "kube-api-access-frxbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.726839 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d161886-aca0-4216-8cd9-3129713ea236-serving-cert\") pod \"route-controller-manager-64dcfb5d85-9xl86\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.727226 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48zvl\" (UniqueName: \"kubernetes.io/projected/3d161886-aca0-4216-8cd9-3129713ea236-kube-api-access-48zvl\") pod \"route-controller-manager-64dcfb5d85-9xl86\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.727365 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d161886-aca0-4216-8cd9-3129713ea236-config\") pod \"route-controller-manager-64dcfb5d85-9xl86\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.727407 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d161886-aca0-4216-8cd9-3129713ea236-client-ca\") pod \"route-controller-manager-64dcfb5d85-9xl86\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.727472 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9ec53a0-9ace-4624-a338-76c2bb73599b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.727510 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.727528 4937 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.727544 4937 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9ec53a0-9ace-4624-a338-76c2bb73599b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.727560 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frxbd\" (UniqueName: \"kubernetes.io/projected/b9ec53a0-9ace-4624-a338-76c2bb73599b-kube-api-access-frxbd\") on node \"crc\" DevicePath \"\"" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.728278 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d161886-aca0-4216-8cd9-3129713ea236-client-ca\") pod \"route-controller-manager-64dcfb5d85-9xl86\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.731153 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d161886-aca0-4216-8cd9-3129713ea236-config\") pod \"route-controller-manager-64dcfb5d85-9xl86\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.732422 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d161886-aca0-4216-8cd9-3129713ea236-serving-cert\") pod \"route-controller-manager-64dcfb5d85-9xl86\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.743044 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48zvl\" (UniqueName: \"kubernetes.io/projected/3d161886-aca0-4216-8cd9-3129713ea236-kube-api-access-48zvl\") pod \"route-controller-manager-64dcfb5d85-9xl86\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:55 crc kubenswrapper[4937]: I0225 15:54:55.850743 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.086812 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86"] Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.251657 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" event={"ID":"52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0","Type":"ContainerDied","Data":"dd6561987215bec8b01b9a6629880d8114ec97a5b2e9b527ae4ce1cfba7c8c4e"} Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.252126 4937 scope.go:117] "RemoveContainer" containerID="53370b1510d6bb0db4d90ba7fe20b6b3714d8411078b090ee4e809b97713322d" Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.251713 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll" Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.255703 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.255949 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5856d46cfd-lg5r8" event={"ID":"b9ec53a0-9ace-4624-a338-76c2bb73599b","Type":"ContainerDied","Data":"5ce6fa94db27fc76ec1cd87b26ca6d87634e7c35dae7e7c9336baba568497cfb"} Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.258426 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" event={"ID":"3d161886-aca0-4216-8cd9-3129713ea236","Type":"ContainerStarted","Data":"e5d6eb841663bca7cdb0539af56fd6be594c0b86acb40ac4c0710ae40b885509"} Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.261633 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.262279 4937 patch_prober.go:28] interesting pod/route-controller-manager-64dcfb5d85-9xl86 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused" start-of-body= Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.262330 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" podUID="3d161886-aca0-4216-8cd9-3129713ea236" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused" Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.277984 4937 scope.go:117] "RemoveContainer" containerID="aaa63e3fbd44180fc925b8331799508e5a32295182fc5841d817031f33791dd4" Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.297704 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" podStartSLOduration=2.297630185 podStartE2EDuration="2.297630185s" podCreationTimestamp="2026-02-25 15:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:54:56.27877587 +0000 UTC m=+547.292167850" watchObservedRunningTime="2026-02-25 15:54:56.297630185 +0000 UTC m=+547.311022115" Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.324574 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll"] Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.330679 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55ddcbdf69-4l2ll"] Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.332535 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5856d46cfd-lg5r8"] Feb 25 15:54:56 crc kubenswrapper[4937]: I0225 15:54:56.336175 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5856d46cfd-lg5r8"] Feb 25 15:54:57 crc kubenswrapper[4937]: I0225 15:54:57.267800 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" event={"ID":"3d161886-aca0-4216-8cd9-3129713ea236","Type":"ContainerStarted","Data":"f0ff71a6cc28d47eec2cb71cbb31dac18d9633056b25de8548507d2957c6a20a"} Feb 25 15:54:57 crc kubenswrapper[4937]: I0225 15:54:57.275453 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:54:57 crc kubenswrapper[4937]: I0225 15:54:57.376720 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0" path="/var/lib/kubelet/pods/52f3eb53-e7cc-41bb-b54e-a004fe4a2bf0/volumes" Feb 25 15:54:57 crc kubenswrapper[4937]: I0225 15:54:57.377966 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9ec53a0-9ace-4624-a338-76c2bb73599b" path="/var/lib/kubelet/pods/b9ec53a0-9ace-4624-a338-76c2bb73599b/volumes" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.271878 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-758cc85675-frsf8"] Feb 25 15:54:58 crc kubenswrapper[4937]: E0225 15:54:58.272525 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ec53a0-9ace-4624-a338-76c2bb73599b" containerName="controller-manager" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.272541 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ec53a0-9ace-4624-a338-76c2bb73599b" containerName="controller-manager" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.272672 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ec53a0-9ace-4624-a338-76c2bb73599b" containerName="controller-manager" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.273137 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.278081 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.278372 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.278397 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.278520 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.278565 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.278840 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.286050 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.334582 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-758cc85675-frsf8"] Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.459208 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-config\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.459255 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-proxy-ca-bundles\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.459407 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2vln\" (UniqueName: \"kubernetes.io/projected/abcbcf49-966c-4490-bd42-d89a995f92cf-kube-api-access-c2vln\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.459975 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abcbcf49-966c-4490-bd42-d89a995f92cf-serving-cert\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.460430 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-client-ca\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.562510 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-config\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.562606 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-proxy-ca-bundles\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.562661 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2vln\" (UniqueName: \"kubernetes.io/projected/abcbcf49-966c-4490-bd42-d89a995f92cf-kube-api-access-c2vln\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.562698 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abcbcf49-966c-4490-bd42-d89a995f92cf-serving-cert\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.562738 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-client-ca\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.564215 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-client-ca\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.564583 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-config\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.565277 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-proxy-ca-bundles\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " 
pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.579407 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abcbcf49-966c-4490-bd42-d89a995f92cf-serving-cert\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.595404 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2vln\" (UniqueName: \"kubernetes.io/projected/abcbcf49-966c-4490-bd42-d89a995f92cf-kube-api-access-c2vln\") pod \"controller-manager-758cc85675-frsf8\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:58 crc kubenswrapper[4937]: I0225 15:54:58.637980 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:59 crc kubenswrapper[4937]: I0225 15:54:59.100162 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-758cc85675-frsf8"] Feb 25 15:54:59 crc kubenswrapper[4937]: I0225 15:54:59.292260 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" event={"ID":"abcbcf49-966c-4490-bd42-d89a995f92cf","Type":"ContainerStarted","Data":"a8ba15a999b4dd4ec4664cef9e9533cb271d41759b42d3be89e608eb12273e14"} Feb 25 15:54:59 crc kubenswrapper[4937]: I0225 15:54:59.292322 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" event={"ID":"abcbcf49-966c-4490-bd42-d89a995f92cf","Type":"ContainerStarted","Data":"3d607f6964459fd9aa14a211b4a10545acfa936b01e52f1d1d812ccd02cf6b52"} Feb 25 15:54:59 crc kubenswrapper[4937]: I0225 15:54:59.292699 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:59 crc kubenswrapper[4937]: I0225 15:54:59.304180 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:54:59 crc kubenswrapper[4937]: I0225 15:54:59.314463 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" podStartSLOduration=5.314444025 podStartE2EDuration="5.314444025s" podCreationTimestamp="2026-02-25 15:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:54:59.311140892 +0000 UTC m=+550.324532802" watchObservedRunningTime="2026-02-25 15:54:59.314444025 +0000 UTC m=+550.327835915" Feb 25 15:55:05 crc kubenswrapper[4937]: I0225 15:55:05.998441 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrgxt"] Feb 25 15:55:06 crc kubenswrapper[4937]: I0225 15:55:05.999237 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mrgxt" podUID="3c753535-03f4-4888-8e28-43b4924726ae" containerName="registry-server" containerID="cri-o://447b39b9ecee8c2bc457edcffb99f49b31b65646338fb309426fd252f5c8d027" gracePeriod=2 Feb 25 15:55:06 crc 
kubenswrapper[4937]: I0225 15:55:06.355377 4937 generic.go:334] "Generic (PLEG): container finished" podID="3c753535-03f4-4888-8e28-43b4924726ae" containerID="447b39b9ecee8c2bc457edcffb99f49b31b65646338fb309426fd252f5c8d027" exitCode=0 Feb 25 15:55:06 crc kubenswrapper[4937]: I0225 15:55:06.355421 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrgxt" event={"ID":"3c753535-03f4-4888-8e28-43b4924726ae","Type":"ContainerDied","Data":"447b39b9ecee8c2bc457edcffb99f49b31b65646338fb309426fd252f5c8d027"} Feb 25 15:55:06 crc kubenswrapper[4937]: I0225 15:55:06.479635 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:55:06 crc kubenswrapper[4937]: I0225 15:55:06.572669 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jjdr\" (UniqueName: \"kubernetes.io/projected/3c753535-03f4-4888-8e28-43b4924726ae-kube-api-access-2jjdr\") pod \"3c753535-03f4-4888-8e28-43b4924726ae\" (UID: \"3c753535-03f4-4888-8e28-43b4924726ae\") " Feb 25 15:55:06 crc kubenswrapper[4937]: I0225 15:55:06.572743 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c753535-03f4-4888-8e28-43b4924726ae-utilities\") pod \"3c753535-03f4-4888-8e28-43b4924726ae\" (UID: \"3c753535-03f4-4888-8e28-43b4924726ae\") " Feb 25 15:55:06 crc kubenswrapper[4937]: I0225 15:55:06.572767 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c753535-03f4-4888-8e28-43b4924726ae-catalog-content\") pod \"3c753535-03f4-4888-8e28-43b4924726ae\" (UID: \"3c753535-03f4-4888-8e28-43b4924726ae\") " Feb 25 15:55:06 crc kubenswrapper[4937]: I0225 15:55:06.573643 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c753535-03f4-4888-8e28-43b4924726ae-utilities" (OuterVolumeSpecName: "utilities") pod "3c753535-03f4-4888-8e28-43b4924726ae" (UID: "3c753535-03f4-4888-8e28-43b4924726ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:55:06 crc kubenswrapper[4937]: I0225 15:55:06.578478 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c753535-03f4-4888-8e28-43b4924726ae-kube-api-access-2jjdr" (OuterVolumeSpecName: "kube-api-access-2jjdr") pod "3c753535-03f4-4888-8e28-43b4924726ae" (UID: "3c753535-03f4-4888-8e28-43b4924726ae"). InnerVolumeSpecName "kube-api-access-2jjdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:55:06 crc kubenswrapper[4937]: I0225 15:55:06.607288 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c753535-03f4-4888-8e28-43b4924726ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c753535-03f4-4888-8e28-43b4924726ae" (UID: "3c753535-03f4-4888-8e28-43b4924726ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:55:06 crc kubenswrapper[4937]: I0225 15:55:06.674292 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jjdr\" (UniqueName: \"kubernetes.io/projected/3c753535-03f4-4888-8e28-43b4924726ae-kube-api-access-2jjdr\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:06 crc kubenswrapper[4937]: I0225 15:55:06.674333 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c753535-03f4-4888-8e28-43b4924726ae-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:06 crc kubenswrapper[4937]: I0225 15:55:06.674344 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c753535-03f4-4888-8e28-43b4924726ae-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.003417 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l9tlm"] Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.003741 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l9tlm" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" containerName="registry-server" containerID="cri-o://fc6791fd338d3867814b729210187a036b40162992c11d395455abbba0bd0686" gracePeriod=2 Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.362914 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mrgxt" Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.362920 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrgxt" event={"ID":"3c753535-03f4-4888-8e28-43b4924726ae","Type":"ContainerDied","Data":"cd824703907873b1cc669f0f863cfefa256aecd170d5d4b8daaed22f9a84a8d2"} Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.363376 4937 scope.go:117] "RemoveContainer" containerID="447b39b9ecee8c2bc457edcffb99f49b31b65646338fb309426fd252f5c8d027" Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.365274 4937 generic.go:334] "Generic (PLEG): container finished" podID="fae7d336-b701-4174-bdae-bd3f1bc032b1" containerID="fc6791fd338d3867814b729210187a036b40162992c11d395455abbba0bd0686" exitCode=0 Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.365299 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9tlm" event={"ID":"fae7d336-b701-4174-bdae-bd3f1bc032b1","Type":"ContainerDied","Data":"fc6791fd338d3867814b729210187a036b40162992c11d395455abbba0bd0686"} Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.378697 4937 scope.go:117] "RemoveContainer" containerID="acc0809d2132a187122f0db7ee87b36ce597ec52eb14bf66aa6cc98b7afd7dbc" Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.398808 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrgxt"] Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.401960 4937 scope.go:117] "RemoveContainer" containerID="50afffbc2a0c0f468eb8f02c16a9bfebf9b9c4158782ac959bac67a586586f9c" Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.402894 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrgxt"] Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.458757 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.585458 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4wwh\" (UniqueName: \"kubernetes.io/projected/fae7d336-b701-4174-bdae-bd3f1bc032b1-kube-api-access-b4wwh\") pod \"fae7d336-b701-4174-bdae-bd3f1bc032b1\" (UID: \"fae7d336-b701-4174-bdae-bd3f1bc032b1\") " Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.585537 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fae7d336-b701-4174-bdae-bd3f1bc032b1-utilities\") pod \"fae7d336-b701-4174-bdae-bd3f1bc032b1\" (UID: \"fae7d336-b701-4174-bdae-bd3f1bc032b1\") " Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.585585 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fae7d336-b701-4174-bdae-bd3f1bc032b1-catalog-content\") pod \"fae7d336-b701-4174-bdae-bd3f1bc032b1\" (UID: \"fae7d336-b701-4174-bdae-bd3f1bc032b1\") " Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.589449 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae7d336-b701-4174-bdae-bd3f1bc032b1-kube-api-access-b4wwh" (OuterVolumeSpecName: "kube-api-access-b4wwh") pod "fae7d336-b701-4174-bdae-bd3f1bc032b1" (UID: "fae7d336-b701-4174-bdae-bd3f1bc032b1"). InnerVolumeSpecName "kube-api-access-b4wwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.601165 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fae7d336-b701-4174-bdae-bd3f1bc032b1-utilities" (OuterVolumeSpecName: "utilities") pod "fae7d336-b701-4174-bdae-bd3f1bc032b1" (UID: "fae7d336-b701-4174-bdae-bd3f1bc032b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.687460 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4wwh\" (UniqueName: \"kubernetes.io/projected/fae7d336-b701-4174-bdae-bd3f1bc032b1-kube-api-access-b4wwh\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.687541 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fae7d336-b701-4174-bdae-bd3f1bc032b1-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.717405 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fae7d336-b701-4174-bdae-bd3f1bc032b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fae7d336-b701-4174-bdae-bd3f1bc032b1" (UID: "fae7d336-b701-4174-bdae-bd3f1bc032b1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:55:07 crc kubenswrapper[4937]: I0225 15:55:07.788796 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fae7d336-b701-4174-bdae-bd3f1bc032b1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:08 crc kubenswrapper[4937]: I0225 15:55:08.376383 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9tlm" event={"ID":"fae7d336-b701-4174-bdae-bd3f1bc032b1","Type":"ContainerDied","Data":"038aefc5b55abeda87c84981b6e0dcc15d183b49a8313d7edb8541461b4cafbe"} Feb 25 15:55:08 crc kubenswrapper[4937]: I0225 15:55:08.376427 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9tlm" Feb 25 15:55:08 crc kubenswrapper[4937]: I0225 15:55:08.376455 4937 scope.go:117] "RemoveContainer" containerID="fc6791fd338d3867814b729210187a036b40162992c11d395455abbba0bd0686" Feb 25 15:55:08 crc kubenswrapper[4937]: I0225 15:55:08.398790 4937 scope.go:117] "RemoveContainer" containerID="088f18cd4ff4698383b7079b3ab76ce7b26f4672d20851c04010cccfc4589231" Feb 25 15:55:08 crc kubenswrapper[4937]: I0225 15:55:08.414981 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l9tlm"] Feb 25 15:55:08 crc kubenswrapper[4937]: I0225 15:55:08.421820 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l9tlm"] Feb 25 15:55:08 crc kubenswrapper[4937]: I0225 15:55:08.443977 4937 scope.go:117] "RemoveContainer" containerID="e29ca1b0226e4224c03df32df7ca5063937ef4b2b1d7b6b1b4f6a281aa2aaafe" Feb 25 15:55:09 crc kubenswrapper[4937]: I0225 15:55:09.380816 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c753535-03f4-4888-8e28-43b4924726ae" path="/var/lib/kubelet/pods/3c753535-03f4-4888-8e28-43b4924726ae/volumes" Feb 25 15:55:09 crc kubenswrapper[4937]: I0225 15:55:09.382405 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" path="/var/lib/kubelet/pods/fae7d336-b701-4174-bdae-bd3f1bc032b1/volumes" Feb 25 15:55:14 crc kubenswrapper[4937]: I0225 15:55:14.330117 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-758cc85675-frsf8"] Feb 25 15:55:14 crc kubenswrapper[4937]: I0225 15:55:14.330754 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" podUID="abcbcf49-966c-4490-bd42-d89a995f92cf" containerName="controller-manager" containerID="cri-o://a8ba15a999b4dd4ec4664cef9e9533cb271d41759b42d3be89e608eb12273e14" gracePeriod=30 Feb 25 15:55:14 crc kubenswrapper[4937]: I0225 15:55:14.473424 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86"] Feb 25 15:55:14 crc kubenswrapper[4937]: I0225 15:55:14.473891 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" podUID="3d161886-aca0-4216-8cd9-3129713ea236" containerName="route-controller-manager" containerID="cri-o://f0ff71a6cc28d47eec2cb71cbb31dac18d9633056b25de8548507d2957c6a20a" gracePeriod=30 Feb 25 15:55:14 crc kubenswrapper[4937]: I0225 15:55:14.940315 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.080811 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48zvl\" (UniqueName: \"kubernetes.io/projected/3d161886-aca0-4216-8cd9-3129713ea236-kube-api-access-48zvl\") pod \"3d161886-aca0-4216-8cd9-3129713ea236\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.080897 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d161886-aca0-4216-8cd9-3129713ea236-client-ca\") pod \"3d161886-aca0-4216-8cd9-3129713ea236\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.081119 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d161886-aca0-4216-8cd9-3129713ea236-config\") pod \"3d161886-aca0-4216-8cd9-3129713ea236\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.081170 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d161886-aca0-4216-8cd9-3129713ea236-serving-cert\") pod \"3d161886-aca0-4216-8cd9-3129713ea236\" (UID: \"3d161886-aca0-4216-8cd9-3129713ea236\") " Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.081703 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d161886-aca0-4216-8cd9-3129713ea236-client-ca" (OuterVolumeSpecName: "client-ca") pod "3d161886-aca0-4216-8cd9-3129713ea236" (UID: "3d161886-aca0-4216-8cd9-3129713ea236"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.081794 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d161886-aca0-4216-8cd9-3129713ea236-config" (OuterVolumeSpecName: "config") pod "3d161886-aca0-4216-8cd9-3129713ea236" (UID: "3d161886-aca0-4216-8cd9-3129713ea236"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.087113 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d161886-aca0-4216-8cd9-3129713ea236-kube-api-access-48zvl" (OuterVolumeSpecName: "kube-api-access-48zvl") pod "3d161886-aca0-4216-8cd9-3129713ea236" (UID: "3d161886-aca0-4216-8cd9-3129713ea236"). InnerVolumeSpecName "kube-api-access-48zvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.090682 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d161886-aca0-4216-8cd9-3129713ea236-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3d161886-aca0-4216-8cd9-3129713ea236" (UID: "3d161886-aca0-4216-8cd9-3129713ea236"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.182772 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d161886-aca0-4216-8cd9-3129713ea236-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.182833 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d161886-aca0-4216-8cd9-3129713ea236-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.182859 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48zvl\" (UniqueName: \"kubernetes.io/projected/3d161886-aca0-4216-8cd9-3129713ea236-kube-api-access-48zvl\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.182883 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d161886-aca0-4216-8cd9-3129713ea236-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.405863 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.477931 4937 generic.go:334] "Generic (PLEG): container finished" podID="3d161886-aca0-4216-8cd9-3129713ea236" containerID="f0ff71a6cc28d47eec2cb71cbb31dac18d9633056b25de8548507d2957c6a20a" exitCode=0 Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.478015 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.478021 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" event={"ID":"3d161886-aca0-4216-8cd9-3129713ea236","Type":"ContainerDied","Data":"f0ff71a6cc28d47eec2cb71cbb31dac18d9633056b25de8548507d2957c6a20a"} Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.478145 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86" event={"ID":"3d161886-aca0-4216-8cd9-3129713ea236","Type":"ContainerDied","Data":"e5d6eb841663bca7cdb0539af56fd6be594c0b86acb40ac4c0710ae40b885509"} Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.478167 4937 scope.go:117] "RemoveContainer" containerID="f0ff71a6cc28d47eec2cb71cbb31dac18d9633056b25de8548507d2957c6a20a" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.480276 4937 generic.go:334] "Generic (PLEG): container finished" podID="abcbcf49-966c-4490-bd42-d89a995f92cf" containerID="a8ba15a999b4dd4ec4664cef9e9533cb271d41759b42d3be89e608eb12273e14" exitCode=0 Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.480314 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" event={"ID":"abcbcf49-966c-4490-bd42-d89a995f92cf","Type":"ContainerDied","Data":"a8ba15a999b4dd4ec4664cef9e9533cb271d41759b42d3be89e608eb12273e14"} Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.480341 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" 
event={"ID":"abcbcf49-966c-4490-bd42-d89a995f92cf","Type":"ContainerDied","Data":"3d607f6964459fd9aa14a211b4a10545acfa936b01e52f1d1d812ccd02cf6b52"} Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.480542 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-758cc85675-frsf8" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.486944 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-config\") pod \"abcbcf49-966c-4490-bd42-d89a995f92cf\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.487024 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-proxy-ca-bundles\") pod \"abcbcf49-966c-4490-bd42-d89a995f92cf\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.487085 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-client-ca\") pod \"abcbcf49-966c-4490-bd42-d89a995f92cf\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.487154 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abcbcf49-966c-4490-bd42-d89a995f92cf-serving-cert\") pod \"abcbcf49-966c-4490-bd42-d89a995f92cf\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.487203 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2vln\" (UniqueName: \"kubernetes.io/projected/abcbcf49-966c-4490-bd42-d89a995f92cf-kube-api-access-c2vln\") pod \"abcbcf49-966c-4490-bd42-d89a995f92cf\" (UID: \"abcbcf49-966c-4490-bd42-d89a995f92cf\") " Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.487724 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "abcbcf49-966c-4490-bd42-d89a995f92cf" (UID: "abcbcf49-966c-4490-bd42-d89a995f92cf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.487842 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-config" (OuterVolumeSpecName: "config") pod "abcbcf49-966c-4490-bd42-d89a995f92cf" (UID: "abcbcf49-966c-4490-bd42-d89a995f92cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.487930 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-client-ca" (OuterVolumeSpecName: "client-ca") pod "abcbcf49-966c-4490-bd42-d89a995f92cf" (UID: "abcbcf49-966c-4490-bd42-d89a995f92cf"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.492133 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abcbcf49-966c-4490-bd42-d89a995f92cf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "abcbcf49-966c-4490-bd42-d89a995f92cf" (UID: "abcbcf49-966c-4490-bd42-d89a995f92cf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.492459 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abcbcf49-966c-4490-bd42-d89a995f92cf-kube-api-access-c2vln" (OuterVolumeSpecName: "kube-api-access-c2vln") pod "abcbcf49-966c-4490-bd42-d89a995f92cf" (UID: "abcbcf49-966c-4490-bd42-d89a995f92cf"). InnerVolumeSpecName "kube-api-access-c2vln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.498717 4937 scope.go:117] "RemoveContainer" containerID="f0ff71a6cc28d47eec2cb71cbb31dac18d9633056b25de8548507d2957c6a20a" Feb 25 15:55:15 crc kubenswrapper[4937]: E0225 15:55:15.499383 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ff71a6cc28d47eec2cb71cbb31dac18d9633056b25de8548507d2957c6a20a\": container with ID starting with f0ff71a6cc28d47eec2cb71cbb31dac18d9633056b25de8548507d2957c6a20a not found: ID does not exist" containerID="f0ff71a6cc28d47eec2cb71cbb31dac18d9633056b25de8548507d2957c6a20a" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.499455 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ff71a6cc28d47eec2cb71cbb31dac18d9633056b25de8548507d2957c6a20a"} err="failed to get container status \"f0ff71a6cc28d47eec2cb71cbb31dac18d9633056b25de8548507d2957c6a20a\": rpc error: code = NotFound desc = could not find container \"f0ff71a6cc28d47eec2cb71cbb31dac18d9633056b25de8548507d2957c6a20a\": container with ID starting with f0ff71a6cc28d47eec2cb71cbb31dac18d9633056b25de8548507d2957c6a20a not found: ID does not exist" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.499540 4937 scope.go:117] "RemoveContainer" containerID="a8ba15a999b4dd4ec4664cef9e9533cb271d41759b42d3be89e608eb12273e14" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.500684 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86"] Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.504411 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dcfb5d85-9xl86"] Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.539734 4937 scope.go:117] "RemoveContainer" containerID="a8ba15a999b4dd4ec4664cef9e9533cb271d41759b42d3be89e608eb12273e14" Feb 25 15:55:15 crc kubenswrapper[4937]: E0225 15:55:15.540190 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ba15a999b4dd4ec4664cef9e9533cb271d41759b42d3be89e608eb12273e14\": container with ID starting with a8ba15a999b4dd4ec4664cef9e9533cb271d41759b42d3be89e608eb12273e14 not found: ID does not exist" containerID="a8ba15a999b4dd4ec4664cef9e9533cb271d41759b42d3be89e608eb12273e14" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.540232 4937 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a8ba15a999b4dd4ec4664cef9e9533cb271d41759b42d3be89e608eb12273e14"} err="failed to get container status \"a8ba15a999b4dd4ec4664cef9e9533cb271d41759b42d3be89e608eb12273e14\": rpc error: code = NotFound desc = could not find container \"a8ba15a999b4dd4ec4664cef9e9533cb271d41759b42d3be89e608eb12273e14\": container with ID starting with a8ba15a999b4dd4ec4664cef9e9533cb271d41759b42d3be89e608eb12273e14 not found: ID does not exist" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.589030 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.589073 4937 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.589088 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/abcbcf49-966c-4490-bd42-d89a995f92cf-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.589101 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abcbcf49-966c-4490-bd42-d89a995f92cf-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.589114 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2vln\" (UniqueName: \"kubernetes.io/projected/abcbcf49-966c-4490-bd42-d89a995f92cf-kube-api-access-c2vln\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.826998 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-758cc85675-frsf8"] Feb 25 15:55:15 crc kubenswrapper[4937]: I0225 15:55:15.830896 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-758cc85675-frsf8"] Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.283683 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-649879fd68-7qljz"] Feb 25 15:55:16 crc kubenswrapper[4937]: E0225 15:55:16.284180 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" containerName="extract-content" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.284228 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" containerName="extract-content" Feb 25 15:55:16 crc kubenswrapper[4937]: E0225 15:55:16.284248 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abcbcf49-966c-4490-bd42-d89a995f92cf" containerName="controller-manager" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.284266 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="abcbcf49-966c-4490-bd42-d89a995f92cf" containerName="controller-manager" Feb 25 15:55:16 crc kubenswrapper[4937]: E0225 15:55:16.284294 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c753535-03f4-4888-8e28-43b4924726ae" containerName="registry-server" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.284312 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c753535-03f4-4888-8e28-43b4924726ae" containerName="registry-server" 
Feb 25 15:55:16 crc kubenswrapper[4937]: E0225 15:55:16.284339 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" containerName="registry-server" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.284356 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" containerName="registry-server" Feb 25 15:55:16 crc kubenswrapper[4937]: E0225 15:55:16.284381 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c753535-03f4-4888-8e28-43b4924726ae" containerName="extract-content" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.284399 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c753535-03f4-4888-8e28-43b4924726ae" containerName="extract-content" Feb 25 15:55:16 crc kubenswrapper[4937]: E0225 15:55:16.284434 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" containerName="extract-utilities" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.284450 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" containerName="extract-utilities" Feb 25 15:55:16 crc kubenswrapper[4937]: E0225 15:55:16.284472 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d161886-aca0-4216-8cd9-3129713ea236" containerName="route-controller-manager" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.284523 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d161886-aca0-4216-8cd9-3129713ea236" containerName="route-controller-manager" Feb 25 15:55:16 crc kubenswrapper[4937]: E0225 15:55:16.284550 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c753535-03f4-4888-8e28-43b4924726ae" containerName="extract-utilities" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.284566 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c753535-03f4-4888-8e28-43b4924726ae" containerName="extract-utilities" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.284803 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae7d336-b701-4174-bdae-bd3f1bc032b1" containerName="registry-server" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.284849 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d161886-aca0-4216-8cd9-3129713ea236" containerName="route-controller-manager" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.284889 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c753535-03f4-4888-8e28-43b4924726ae" containerName="registry-server" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.284908 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="abcbcf49-966c-4490-bd42-d89a995f92cf" containerName="controller-manager" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.285750 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.288295 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24"] Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.289139 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.289209 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.290241 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.290545 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.296561 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.297274 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.297479 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.297566 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.297752 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.300668 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.301826 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.301960 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.303088 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.311741 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.314204 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-649879fd68-7qljz"] Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.319265 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24"] Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.398654 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ec4957-2356-47fa-bcd1-12124a63f06d-serving-cert\") pod \"route-controller-manager-944dcdc54-dqw24\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.398797 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xg6fz\" (UniqueName: \"kubernetes.io/projected/32ec4957-2356-47fa-bcd1-12124a63f06d-kube-api-access-xg6fz\") pod \"route-controller-manager-944dcdc54-dqw24\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.398858 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-config\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.398904 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ec4957-2356-47fa-bcd1-12124a63f06d-config\") pod \"route-controller-manager-944dcdc54-dqw24\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.398954 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0bfaaf2-3982-4625-9c54-654ec76689c5-serving-cert\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.398984 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-client-ca\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.399012 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nw4v\" (UniqueName: \"kubernetes.io/projected/e0bfaaf2-3982-4625-9c54-654ec76689c5-kube-api-access-7nw4v\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.399112 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32ec4957-2356-47fa-bcd1-12124a63f06d-client-ca\") pod \"route-controller-manager-944dcdc54-dqw24\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.399145 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-proxy-ca-bundles\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.500543 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-config\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.500622 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ec4957-2356-47fa-bcd1-12124a63f06d-config\") pod \"route-controller-manager-944dcdc54-dqw24\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.500681 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0bfaaf2-3982-4625-9c54-654ec76689c5-serving-cert\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.500724 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-client-ca\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.500765 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nw4v\" (UniqueName: \"kubernetes.io/projected/e0bfaaf2-3982-4625-9c54-654ec76689c5-kube-api-access-7nw4v\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.500834 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32ec4957-2356-47fa-bcd1-12124a63f06d-client-ca\") pod \"route-controller-manager-944dcdc54-dqw24\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.500882 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-proxy-ca-bundles\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.500942 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ec4957-2356-47fa-bcd1-12124a63f06d-serving-cert\") pod \"route-controller-manager-944dcdc54-dqw24\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.501015 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg6fz\" (UniqueName: \"kubernetes.io/projected/32ec4957-2356-47fa-bcd1-12124a63f06d-kube-api-access-xg6fz\") pod \"route-controller-manager-944dcdc54-dqw24\" (UID: 
\"32ec4957-2356-47fa-bcd1-12124a63f06d\") " pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.502056 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-client-ca\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.503049 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ec4957-2356-47fa-bcd1-12124a63f06d-config\") pod \"route-controller-manager-944dcdc54-dqw24\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.503316 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32ec4957-2356-47fa-bcd1-12124a63f06d-client-ca\") pod \"route-controller-manager-944dcdc54-dqw24\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.503637 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-proxy-ca-bundles\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.504711 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-config\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.506285 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0bfaaf2-3982-4625-9c54-654ec76689c5-serving-cert\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.513370 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ec4957-2356-47fa-bcd1-12124a63f06d-serving-cert\") pod \"route-controller-manager-944dcdc54-dqw24\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.522751 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nw4v\" (UniqueName: \"kubernetes.io/projected/e0bfaaf2-3982-4625-9c54-654ec76689c5-kube-api-access-7nw4v\") pod \"controller-manager-649879fd68-7qljz\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.525172 4937 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xg6fz\" (UniqueName: \"kubernetes.io/projected/32ec4957-2356-47fa-bcd1-12124a63f06d-kube-api-access-xg6fz\") pod \"route-controller-manager-944dcdc54-dqw24\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.615972 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:16 crc kubenswrapper[4937]: I0225 15:55:16.627590 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.067602 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-649879fd68-7qljz"] Feb 25 15:55:17 crc kubenswrapper[4937]: W0225 15:55:17.077722 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0bfaaf2_3982_4625_9c54_654ec76689c5.slice/crio-b29cec98d93aefc0c7e61df709fe68dd13f93cf6b9c74246b0195f4dec011045 WatchSource:0}: Error finding container b29cec98d93aefc0c7e61df709fe68dd13f93cf6b9c74246b0195f4dec011045: Status 404 returned error can't find the container with id b29cec98d93aefc0c7e61df709fe68dd13f93cf6b9c74246b0195f4dec011045 Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.148989 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24"] Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.376287 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d161886-aca0-4216-8cd9-3129713ea236" path="/var/lib/kubelet/pods/3d161886-aca0-4216-8cd9-3129713ea236/volumes" Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.376931 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abcbcf49-966c-4490-bd42-d89a995f92cf" path="/var/lib/kubelet/pods/abcbcf49-966c-4490-bd42-d89a995f92cf/volumes" Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.498429 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" event={"ID":"32ec4957-2356-47fa-bcd1-12124a63f06d","Type":"ContainerStarted","Data":"eaa309096de85e4c9b8b621e3ec67f99dfb7546b808fe76f3f780e7834fc9336"} Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.498508 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.498524 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" event={"ID":"32ec4957-2356-47fa-bcd1-12124a63f06d","Type":"ContainerStarted","Data":"65522740cd1ab17a0d78aff64d63dc8b4d8b8a49380bf121b8f9081cdbf5b241"} Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.499872 4937 patch_prober.go:28] interesting pod/route-controller-manager-944dcdc54-dqw24 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": dial tcp 10.217.0.72:8443: connect: connection refused" start-of-body= Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.499912 4937 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" podUID="32ec4957-2356-47fa-bcd1-12124a63f06d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": dial tcp 10.217.0.72:8443: connect: connection refused" Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.500117 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" event={"ID":"e0bfaaf2-3982-4625-9c54-654ec76689c5","Type":"ContainerStarted","Data":"d02272d5dacca8d6d6512f2eefd9553ee59971ede100b60ffd9bf837ae1a5408"} Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.500174 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" event={"ID":"e0bfaaf2-3982-4625-9c54-654ec76689c5","Type":"ContainerStarted","Data":"b29cec98d93aefc0c7e61df709fe68dd13f93cf6b9c74246b0195f4dec011045"} Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.500327 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.504275 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.520992 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" podStartSLOduration=3.520970444 podStartE2EDuration="3.520970444s" podCreationTimestamp="2026-02-25 15:55:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:55:17.518732458 +0000 UTC m=+568.532124338" watchObservedRunningTime="2026-02-25 15:55:17.520970444 +0000 UTC m=+568.534362334" Feb 25 15:55:17 crc kubenswrapper[4937]: I0225 15:55:17.541194 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" podStartSLOduration=3.541176302 podStartE2EDuration="3.541176302s" podCreationTimestamp="2026-02-25 15:55:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:55:17.540455034 +0000 UTC m=+568.553846934" watchObservedRunningTime="2026-02-25 15:55:17.541176302 +0000 UTC m=+568.554568192" Feb 25 15:55:18 crc kubenswrapper[4937]: I0225 15:55:18.513733 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:34 crc kubenswrapper[4937]: I0225 15:55:34.348915 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-649879fd68-7qljz"] Feb 25 15:55:34 crc kubenswrapper[4937]: I0225 15:55:34.349965 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" podUID="e0bfaaf2-3982-4625-9c54-654ec76689c5" containerName="controller-manager" containerID="cri-o://d02272d5dacca8d6d6512f2eefd9553ee59971ede100b60ffd9bf837ae1a5408" gracePeriod=30 Feb 25 15:55:34 crc kubenswrapper[4937]: I0225 15:55:34.353285 4937 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24"] Feb 25 15:55:34 crc kubenswrapper[4937]: I0225 15:55:34.353536 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" podUID="32ec4957-2356-47fa-bcd1-12124a63f06d" containerName="route-controller-manager" containerID="cri-o://eaa309096de85e4c9b8b621e3ec67f99dfb7546b808fe76f3f780e7834fc9336" gracePeriod=30 Feb 25 15:55:34 crc kubenswrapper[4937]: I0225 15:55:34.630254 4937 generic.go:334] "Generic (PLEG): container finished" podID="e0bfaaf2-3982-4625-9c54-654ec76689c5" containerID="d02272d5dacca8d6d6512f2eefd9553ee59971ede100b60ffd9bf837ae1a5408" exitCode=0 Feb 25 15:55:34 crc kubenswrapper[4937]: I0225 15:55:34.630387 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" event={"ID":"e0bfaaf2-3982-4625-9c54-654ec76689c5","Type":"ContainerDied","Data":"d02272d5dacca8d6d6512f2eefd9553ee59971ede100b60ffd9bf837ae1a5408"} Feb 25 15:55:34 crc kubenswrapper[4937]: I0225 15:55:34.631556 4937 generic.go:334] "Generic (PLEG): container finished" podID="32ec4957-2356-47fa-bcd1-12124a63f06d" containerID="eaa309096de85e4c9b8b621e3ec67f99dfb7546b808fe76f3f780e7834fc9336" exitCode=0 Feb 25 15:55:34 crc kubenswrapper[4937]: I0225 15:55:34.631588 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" event={"ID":"32ec4957-2356-47fa-bcd1-12124a63f06d","Type":"ContainerDied","Data":"eaa309096de85e4c9b8b621e3ec67f99dfb7546b808fe76f3f780e7834fc9336"} Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.447383 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.481678 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d79766549-sd626"] Feb 25 15:55:35 crc kubenswrapper[4937]: E0225 15:55:35.482394 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ec4957-2356-47fa-bcd1-12124a63f06d" containerName="route-controller-manager" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.482406 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ec4957-2356-47fa-bcd1-12124a63f06d" containerName="route-controller-manager" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.482636 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ec4957-2356-47fa-bcd1-12124a63f06d" containerName="route-controller-manager" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.483324 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.496317 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d79766549-sd626"] Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.522200 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.575532 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg6fz\" (UniqueName: \"kubernetes.io/projected/32ec4957-2356-47fa-bcd1-12124a63f06d-kube-api-access-xg6fz\") pod \"32ec4957-2356-47fa-bcd1-12124a63f06d\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.575604 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ec4957-2356-47fa-bcd1-12124a63f06d-config\") pod \"32ec4957-2356-47fa-bcd1-12124a63f06d\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.575849 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ec4957-2356-47fa-bcd1-12124a63f06d-serving-cert\") pod \"32ec4957-2356-47fa-bcd1-12124a63f06d\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.575907 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32ec4957-2356-47fa-bcd1-12124a63f06d-client-ca\") pod \"32ec4957-2356-47fa-bcd1-12124a63f06d\" (UID: \"32ec4957-2356-47fa-bcd1-12124a63f06d\") " Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.576098 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d62779-8acc-4cf5-99f0-a3389d1af7cd-config\") pod \"route-controller-manager-6d79766549-sd626\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.576140 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54d62779-8acc-4cf5-99f0-a3389d1af7cd-client-ca\") pod \"route-controller-manager-6d79766549-sd626\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.576189 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54d62779-8acc-4cf5-99f0-a3389d1af7cd-serving-cert\") pod \"route-controller-manager-6d79766549-sd626\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.576233 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjslg\" (UniqueName: \"kubernetes.io/projected/54d62779-8acc-4cf5-99f0-a3389d1af7cd-kube-api-access-sjslg\") pod \"route-controller-manager-6d79766549-sd626\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.576732 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32ec4957-2356-47fa-bcd1-12124a63f06d-client-ca" (OuterVolumeSpecName: 
"client-ca") pod "32ec4957-2356-47fa-bcd1-12124a63f06d" (UID: "32ec4957-2356-47fa-bcd1-12124a63f06d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.577151 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32ec4957-2356-47fa-bcd1-12124a63f06d-config" (OuterVolumeSpecName: "config") pod "32ec4957-2356-47fa-bcd1-12124a63f06d" (UID: "32ec4957-2356-47fa-bcd1-12124a63f06d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.581455 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ec4957-2356-47fa-bcd1-12124a63f06d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "32ec4957-2356-47fa-bcd1-12124a63f06d" (UID: "32ec4957-2356-47fa-bcd1-12124a63f06d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.581853 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ec4957-2356-47fa-bcd1-12124a63f06d-kube-api-access-xg6fz" (OuterVolumeSpecName: "kube-api-access-xg6fz") pod "32ec4957-2356-47fa-bcd1-12124a63f06d" (UID: "32ec4957-2356-47fa-bcd1-12124a63f06d"). InnerVolumeSpecName "kube-api-access-xg6fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.638105 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" event={"ID":"e0bfaaf2-3982-4625-9c54-654ec76689c5","Type":"ContainerDied","Data":"b29cec98d93aefc0c7e61df709fe68dd13f93cf6b9c74246b0195f4dec011045"} Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.638163 4937 scope.go:117] "RemoveContainer" containerID="d02272d5dacca8d6d6512f2eefd9553ee59971ede100b60ffd9bf837ae1a5408" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.638490 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-649879fd68-7qljz" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.640578 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.640597 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24" event={"ID":"32ec4957-2356-47fa-bcd1-12124a63f06d","Type":"ContainerDied","Data":"65522740cd1ab17a0d78aff64d63dc8b4d8b8a49380bf121b8f9081cdbf5b241"} Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.657501 4937 scope.go:117] "RemoveContainer" containerID="eaa309096de85e4c9b8b621e3ec67f99dfb7546b808fe76f3f780e7834fc9336" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.672916 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24"] Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.676428 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-944dcdc54-dqw24"] Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.676974 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-proxy-ca-bundles\") pod \"e0bfaaf2-3982-4625-9c54-654ec76689c5\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677074 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-client-ca\") pod \"e0bfaaf2-3982-4625-9c54-654ec76689c5\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677147 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-config\") pod \"e0bfaaf2-3982-4625-9c54-654ec76689c5\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677181 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0bfaaf2-3982-4625-9c54-654ec76689c5-serving-cert\") pod \"e0bfaaf2-3982-4625-9c54-654ec76689c5\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677228 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nw4v\" (UniqueName: \"kubernetes.io/projected/e0bfaaf2-3982-4625-9c54-654ec76689c5-kube-api-access-7nw4v\") pod \"e0bfaaf2-3982-4625-9c54-654ec76689c5\" (UID: \"e0bfaaf2-3982-4625-9c54-654ec76689c5\") " Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677398 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d62779-8acc-4cf5-99f0-a3389d1af7cd-config\") pod \"route-controller-manager-6d79766549-sd626\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677437 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54d62779-8acc-4cf5-99f0-a3389d1af7cd-client-ca\") pod \"route-controller-manager-6d79766549-sd626\" (UID: 
\"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677486 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54d62779-8acc-4cf5-99f0-a3389d1af7cd-serving-cert\") pod \"route-controller-manager-6d79766549-sd626\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677549 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjslg\" (UniqueName: \"kubernetes.io/projected/54d62779-8acc-4cf5-99f0-a3389d1af7cd-kube-api-access-sjslg\") pod \"route-controller-manager-6d79766549-sd626\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677596 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32ec4957-2356-47fa-bcd1-12124a63f06d-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677612 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg6fz\" (UniqueName: \"kubernetes.io/projected/32ec4957-2356-47fa-bcd1-12124a63f06d-kube-api-access-xg6fz\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677625 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ec4957-2356-47fa-bcd1-12124a63f06d-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677636 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ec4957-2356-47fa-bcd1-12124a63f06d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677872 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-client-ca" (OuterVolumeSpecName: "client-ca") pod "e0bfaaf2-3982-4625-9c54-654ec76689c5" (UID: "e0bfaaf2-3982-4625-9c54-654ec76689c5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.677937 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e0bfaaf2-3982-4625-9c54-654ec76689c5" (UID: "e0bfaaf2-3982-4625-9c54-654ec76689c5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.678421 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-config" (OuterVolumeSpecName: "config") pod "e0bfaaf2-3982-4625-9c54-654ec76689c5" (UID: "e0bfaaf2-3982-4625-9c54-654ec76689c5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.678562 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54d62779-8acc-4cf5-99f0-a3389d1af7cd-client-ca\") pod \"route-controller-manager-6d79766549-sd626\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.679879 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0bfaaf2-3982-4625-9c54-654ec76689c5-kube-api-access-7nw4v" (OuterVolumeSpecName: "kube-api-access-7nw4v") pod "e0bfaaf2-3982-4625-9c54-654ec76689c5" (UID: "e0bfaaf2-3982-4625-9c54-654ec76689c5"). InnerVolumeSpecName "kube-api-access-7nw4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.680445 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d62779-8acc-4cf5-99f0-a3389d1af7cd-config\") pod \"route-controller-manager-6d79766549-sd626\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.681479 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54d62779-8acc-4cf5-99f0-a3389d1af7cd-serving-cert\") pod \"route-controller-manager-6d79766549-sd626\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.681649 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0bfaaf2-3982-4625-9c54-654ec76689c5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e0bfaaf2-3982-4625-9c54-654ec76689c5" (UID: "e0bfaaf2-3982-4625-9c54-654ec76689c5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.691597 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjslg\" (UniqueName: \"kubernetes.io/projected/54d62779-8acc-4cf5-99f0-a3389d1af7cd-kube-api-access-sjslg\") pod \"route-controller-manager-6d79766549-sd626\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.779717 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.779849 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.779869 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0bfaaf2-3982-4625-9c54-654ec76689c5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.779888 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nw4v\" (UniqueName: \"kubernetes.io/projected/e0bfaaf2-3982-4625-9c54-654ec76689c5-kube-api-access-7nw4v\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.779907 4937 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0bfaaf2-3982-4625-9c54-654ec76689c5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.833950 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.991606 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-649879fd68-7qljz"] Feb 25 15:55:35 crc kubenswrapper[4937]: I0225 15:55:35.999085 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-649879fd68-7qljz"] Feb 25 15:55:36 crc kubenswrapper[4937]: I0225 15:55:36.311144 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d79766549-sd626"] Feb 25 15:55:36 crc kubenswrapper[4937]: I0225 15:55:36.653901 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" event={"ID":"54d62779-8acc-4cf5-99f0-a3389d1af7cd","Type":"ContainerStarted","Data":"e638a815875f4b6d1dbf4c5867f120b560f7cf9f4bca8cee0b3ec562c1957b0a"} Feb 25 15:55:36 crc kubenswrapper[4937]: I0225 15:55:36.653980 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" event={"ID":"54d62779-8acc-4cf5-99f0-a3389d1af7cd","Type":"ContainerStarted","Data":"0ef278c0ecf8653de71ed66535b9f66565fe6bb6f2c24c8614bc4ee6cdfddf1b"} Feb 25 15:55:36 crc kubenswrapper[4937]: I0225 15:55:36.654872 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:36 crc kubenswrapper[4937]: I0225 15:55:36.675036 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" podStartSLOduration=2.675016425 podStartE2EDuration="2.675016425s" podCreationTimestamp="2026-02-25 15:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:55:36.671945657 +0000 UTC m=+587.685337547" watchObservedRunningTime="2026-02-25 15:55:36.675016425 +0000 UTC m=+587.688408315" Feb 25 15:55:36 crc kubenswrapper[4937]: I0225 15:55:36.791910 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:37 crc kubenswrapper[4937]: I0225 15:55:37.376862 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ec4957-2356-47fa-bcd1-12124a63f06d" path="/var/lib/kubelet/pods/32ec4957-2356-47fa-bcd1-12124a63f06d/volumes" Feb 25 15:55:37 crc kubenswrapper[4937]: I0225 15:55:37.378830 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0bfaaf2-3982-4625-9c54-654ec76689c5" path="/var/lib/kubelet/pods/e0bfaaf2-3982-4625-9c54-654ec76689c5/volumes" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.306217 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-857cd749c-f8nrr"] Feb 25 15:55:38 crc kubenswrapper[4937]: E0225 15:55:38.306575 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0bfaaf2-3982-4625-9c54-654ec76689c5" containerName="controller-manager" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.306595 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0bfaaf2-3982-4625-9c54-654ec76689c5" containerName="controller-manager" Feb 25 15:55:38 crc 
kubenswrapper[4937]: I0225 15:55:38.306758 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0bfaaf2-3982-4625-9c54-654ec76689c5" containerName="controller-manager" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.307358 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.310263 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.311013 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.311283 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.311612 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.314044 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.314168 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.320614 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.324809 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-857cd749c-f8nrr"] Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.419711 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11c71b93-1dfa-4837-a5e9-00f14115b891-serving-cert\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.419978 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-proxy-ca-bundles\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.420120 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-config\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.420190 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-client-ca\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " 
pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.420227 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lmtg\" (UniqueName: \"kubernetes.io/projected/11c71b93-1dfa-4837-a5e9-00f14115b891-kube-api-access-2lmtg\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.520880 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-proxy-ca-bundles\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.520952 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-config\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.520989 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-client-ca\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.521019 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lmtg\" (UniqueName: \"kubernetes.io/projected/11c71b93-1dfa-4837-a5e9-00f14115b891-kube-api-access-2lmtg\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.521071 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11c71b93-1dfa-4837-a5e9-00f14115b891-serving-cert\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.522454 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-client-ca\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.522723 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-config\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.522943 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-proxy-ca-bundles\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.530951 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11c71b93-1dfa-4837-a5e9-00f14115b891-serving-cert\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.549534 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lmtg\" (UniqueName: \"kubernetes.io/projected/11c71b93-1dfa-4837-a5e9-00f14115b891-kube-api-access-2lmtg\") pod \"controller-manager-857cd749c-f8nrr\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.679559 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:38 crc kubenswrapper[4937]: I0225 15:55:38.923387 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-857cd749c-f8nrr"] Feb 25 15:55:38 crc kubenswrapper[4937]: W0225 15:55:38.928714 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11c71b93_1dfa_4837_a5e9_00f14115b891.slice/crio-9db2af5ba787db30d6c2f5234097623118dbb1e00a9557cf55456b9665756933 WatchSource:0}: Error finding container 9db2af5ba787db30d6c2f5234097623118dbb1e00a9557cf55456b9665756933: Status 404 returned error can't find the container with id 9db2af5ba787db30d6c2f5234097623118dbb1e00a9557cf55456b9665756933 Feb 25 15:55:39 crc kubenswrapper[4937]: I0225 15:55:39.670515 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" event={"ID":"11c71b93-1dfa-4837-a5e9-00f14115b891","Type":"ContainerStarted","Data":"b2793e87548575b3c29f84c50fa20e490bb4de84eb1b11c039d467a4c673a59f"} Feb 25 15:55:39 crc kubenswrapper[4937]: I0225 15:55:39.670850 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" event={"ID":"11c71b93-1dfa-4837-a5e9-00f14115b891","Type":"ContainerStarted","Data":"9db2af5ba787db30d6c2f5234097623118dbb1e00a9557cf55456b9665756933"} Feb 25 15:55:39 crc kubenswrapper[4937]: I0225 15:55:39.670868 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:39 crc kubenswrapper[4937]: I0225 15:55:39.675198 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:39 crc kubenswrapper[4937]: I0225 15:55:39.706661 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" podStartSLOduration=5.706640423 podStartE2EDuration="5.706640423s" podCreationTimestamp="2026-02-25 15:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:55:39.690842324 +0000 UTC m=+590.704234214" watchObservedRunningTime="2026-02-25 15:55:39.706640423 +0000 UTC m=+590.720032313" Feb 25 15:55:54 crc kubenswrapper[4937]: I0225 15:55:54.336979 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-857cd749c-f8nrr"] Feb 25 15:55:54 crc kubenswrapper[4937]: I0225 15:55:54.337768 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" podUID="11c71b93-1dfa-4837-a5e9-00f14115b891" containerName="controller-manager" containerID="cri-o://b2793e87548575b3c29f84c50fa20e490bb4de84eb1b11c039d467a4c673a59f" gracePeriod=30 Feb 25 15:55:54 crc kubenswrapper[4937]: I0225 15:55:54.437124 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d79766549-sd626"] Feb 25 15:55:54 crc kubenswrapper[4937]: I0225 15:55:54.437351 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" podUID="54d62779-8acc-4cf5-99f0-a3389d1af7cd" containerName="route-controller-manager" containerID="cri-o://e638a815875f4b6d1dbf4c5867f120b560f7cf9f4bca8cee0b3ec562c1957b0a" gracePeriod=30 Feb 25 15:55:54 crc kubenswrapper[4937]: I0225 15:55:54.792518 4937 generic.go:334] "Generic (PLEG): container finished" podID="11c71b93-1dfa-4837-a5e9-00f14115b891" containerID="b2793e87548575b3c29f84c50fa20e490bb4de84eb1b11c039d467a4c673a59f" exitCode=0 Feb 25 15:55:54 crc kubenswrapper[4937]: I0225 15:55:54.792609 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" event={"ID":"11c71b93-1dfa-4837-a5e9-00f14115b891","Type":"ContainerDied","Data":"b2793e87548575b3c29f84c50fa20e490bb4de84eb1b11c039d467a4c673a59f"} Feb 25 15:55:54 crc kubenswrapper[4937]: I0225 15:55:54.795428 4937 generic.go:334] "Generic (PLEG): container finished" podID="54d62779-8acc-4cf5-99f0-a3389d1af7cd" containerID="e638a815875f4b6d1dbf4c5867f120b560f7cf9f4bca8cee0b3ec562c1957b0a" exitCode=0 Feb 25 15:55:54 crc kubenswrapper[4937]: I0225 15:55:54.795507 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" event={"ID":"54d62779-8acc-4cf5-99f0-a3389d1af7cd","Type":"ContainerDied","Data":"e638a815875f4b6d1dbf4c5867f120b560f7cf9f4bca8cee0b3ec562c1957b0a"} Feb 25 15:55:54 crc kubenswrapper[4937]: I0225 15:55:54.986384 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.073262 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.176262 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjslg\" (UniqueName: \"kubernetes.io/projected/54d62779-8acc-4cf5-99f0-a3389d1af7cd-kube-api-access-sjslg\") pod \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.176305 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-proxy-ca-bundles\") pod \"11c71b93-1dfa-4837-a5e9-00f14115b891\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.176350 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54d62779-8acc-4cf5-99f0-a3389d1af7cd-serving-cert\") pod \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.176366 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-client-ca\") pod \"11c71b93-1dfa-4837-a5e9-00f14115b891\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.176383 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-config\") pod \"11c71b93-1dfa-4837-a5e9-00f14115b891\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.176413 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11c71b93-1dfa-4837-a5e9-00f14115b891-serving-cert\") pod \"11c71b93-1dfa-4837-a5e9-00f14115b891\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.176432 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d62779-8acc-4cf5-99f0-a3389d1af7cd-config\") pod \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.176520 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54d62779-8acc-4cf5-99f0-a3389d1af7cd-client-ca\") pod \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\" (UID: \"54d62779-8acc-4cf5-99f0-a3389d1af7cd\") " Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.176541 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lmtg\" (UniqueName: \"kubernetes.io/projected/11c71b93-1dfa-4837-a5e9-00f14115b891-kube-api-access-2lmtg\") pod \"11c71b93-1dfa-4837-a5e9-00f14115b891\" (UID: \"11c71b93-1dfa-4837-a5e9-00f14115b891\") " Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.177154 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "11c71b93-1dfa-4837-a5e9-00f14115b891" 
(UID: "11c71b93-1dfa-4837-a5e9-00f14115b891"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.177257 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d62779-8acc-4cf5-99f0-a3389d1af7cd-client-ca" (OuterVolumeSpecName: "client-ca") pod "54d62779-8acc-4cf5-99f0-a3389d1af7cd" (UID: "54d62779-8acc-4cf5-99f0-a3389d1af7cd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.177278 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-client-ca" (OuterVolumeSpecName: "client-ca") pod "11c71b93-1dfa-4837-a5e9-00f14115b891" (UID: "11c71b93-1dfa-4837-a5e9-00f14115b891"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.177324 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d62779-8acc-4cf5-99f0-a3389d1af7cd-config" (OuterVolumeSpecName: "config") pod "54d62779-8acc-4cf5-99f0-a3389d1af7cd" (UID: "54d62779-8acc-4cf5-99f0-a3389d1af7cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.177332 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-config" (OuterVolumeSpecName: "config") pod "11c71b93-1dfa-4837-a5e9-00f14115b891" (UID: "11c71b93-1dfa-4837-a5e9-00f14115b891"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.181698 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54d62779-8acc-4cf5-99f0-a3389d1af7cd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "54d62779-8acc-4cf5-99f0-a3389d1af7cd" (UID: "54d62779-8acc-4cf5-99f0-a3389d1af7cd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.181766 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d62779-8acc-4cf5-99f0-a3389d1af7cd-kube-api-access-sjslg" (OuterVolumeSpecName: "kube-api-access-sjslg") pod "54d62779-8acc-4cf5-99f0-a3389d1af7cd" (UID: "54d62779-8acc-4cf5-99f0-a3389d1af7cd"). InnerVolumeSpecName "kube-api-access-sjslg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.181852 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c71b93-1dfa-4837-a5e9-00f14115b891-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "11c71b93-1dfa-4837-a5e9-00f14115b891" (UID: "11c71b93-1dfa-4837-a5e9-00f14115b891"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.181981 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c71b93-1dfa-4837-a5e9-00f14115b891-kube-api-access-2lmtg" (OuterVolumeSpecName: "kube-api-access-2lmtg") pod "11c71b93-1dfa-4837-a5e9-00f14115b891" (UID: "11c71b93-1dfa-4837-a5e9-00f14115b891"). 
InnerVolumeSpecName "kube-api-access-2lmtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.278029 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjslg\" (UniqueName: \"kubernetes.io/projected/54d62779-8acc-4cf5-99f0-a3389d1af7cd-kube-api-access-sjslg\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.278090 4937 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.278111 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54d62779-8acc-4cf5-99f0-a3389d1af7cd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.278135 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.278206 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11c71b93-1dfa-4837-a5e9-00f14115b891-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.278233 4937 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11c71b93-1dfa-4837-a5e9-00f14115b891-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.278255 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d62779-8acc-4cf5-99f0-a3389d1af7cd-config\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.278307 4937 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54d62779-8acc-4cf5-99f0-a3389d1af7cd-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.278335 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lmtg\" (UniqueName: \"kubernetes.io/projected/11c71b93-1dfa-4837-a5e9-00f14115b891-kube-api-access-2lmtg\") on node \"crc\" DevicePath \"\"" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.806580 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.806590 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-857cd749c-f8nrr" event={"ID":"11c71b93-1dfa-4837-a5e9-00f14115b891","Type":"ContainerDied","Data":"9db2af5ba787db30d6c2f5234097623118dbb1e00a9557cf55456b9665756933"} Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.808523 4937 scope.go:117] "RemoveContainer" containerID="b2793e87548575b3c29f84c50fa20e490bb4de84eb1b11c039d467a4c673a59f" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.809561 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" event={"ID":"54d62779-8acc-4cf5-99f0-a3389d1af7cd","Type":"ContainerDied","Data":"0ef278c0ecf8653de71ed66535b9f66565fe6bb6f2c24c8614bc4ee6cdfddf1b"} Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.809633 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d79766549-sd626" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.828654 4937 scope.go:117] "RemoveContainer" containerID="e638a815875f4b6d1dbf4c5867f120b560f7cf9f4bca8cee0b3ec562c1957b0a" Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.850356 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-857cd749c-f8nrr"] Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.863599 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-857cd749c-f8nrr"] Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.867302 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d79766549-sd626"] Feb 25 15:55:55 crc kubenswrapper[4937]: I0225 15:55:55.871325 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d79766549-sd626"] Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.312645 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2"] Feb 25 15:55:56 crc kubenswrapper[4937]: E0225 15:55:56.312940 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d62779-8acc-4cf5-99f0-a3389d1af7cd" containerName="route-controller-manager" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.312955 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d62779-8acc-4cf5-99f0-a3389d1af7cd" containerName="route-controller-manager" Feb 25 15:55:56 crc kubenswrapper[4937]: E0225 15:55:56.312972 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c71b93-1dfa-4837-a5e9-00f14115b891" containerName="controller-manager" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.312984 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c71b93-1dfa-4837-a5e9-00f14115b891" containerName="controller-manager" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.313120 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c71b93-1dfa-4837-a5e9-00f14115b891" containerName="controller-manager" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.313138 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d62779-8acc-4cf5-99f0-a3389d1af7cd" 
containerName="route-controller-manager" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.313698 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.315939 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c845d5669-lwb7k"] Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.316990 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.317858 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.318409 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.318612 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.318790 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.319166 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.319589 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.319768 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.319931 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.320198 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.324173 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.324544 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.327731 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.327976 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2"] Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.332751 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c845d5669-lwb7k"] Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.349056 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.494443 
4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzl2f\" (UniqueName: \"kubernetes.io/projected/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-kube-api-access-pzl2f\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.494562 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-proxy-ca-bundles\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.494598 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef24f605-7dad-4ad7-b3cd-783f67312a1f-client-ca\") pod \"route-controller-manager-99c8db557-d7nv2\" (UID: \"ef24f605-7dad-4ad7-b3cd-783f67312a1f\") " pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.494634 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-client-ca\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.494735 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef24f605-7dad-4ad7-b3cd-783f67312a1f-config\") pod \"route-controller-manager-99c8db557-d7nv2\" (UID: \"ef24f605-7dad-4ad7-b3cd-783f67312a1f\") " pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.494767 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-serving-cert\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.494801 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef24f605-7dad-4ad7-b3cd-783f67312a1f-serving-cert\") pod \"route-controller-manager-99c8db557-d7nv2\" (UID: \"ef24f605-7dad-4ad7-b3cd-783f67312a1f\") " pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.494846 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-config\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.494892 4937 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhtbg\" (UniqueName: \"kubernetes.io/projected/ef24f605-7dad-4ad7-b3cd-783f67312a1f-kube-api-access-qhtbg\") pod \"route-controller-manager-99c8db557-d7nv2\" (UID: \"ef24f605-7dad-4ad7-b3cd-783f67312a1f\") " pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.595848 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzl2f\" (UniqueName: \"kubernetes.io/projected/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-kube-api-access-pzl2f\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.595904 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-proxy-ca-bundles\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.595935 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef24f605-7dad-4ad7-b3cd-783f67312a1f-client-ca\") pod \"route-controller-manager-99c8db557-d7nv2\" (UID: \"ef24f605-7dad-4ad7-b3cd-783f67312a1f\") " pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.595960 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-client-ca\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.596000 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef24f605-7dad-4ad7-b3cd-783f67312a1f-config\") pod \"route-controller-manager-99c8db557-d7nv2\" (UID: \"ef24f605-7dad-4ad7-b3cd-783f67312a1f\") " pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.596025 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-serving-cert\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.596048 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef24f605-7dad-4ad7-b3cd-783f67312a1f-serving-cert\") pod \"route-controller-manager-99c8db557-d7nv2\" (UID: \"ef24f605-7dad-4ad7-b3cd-783f67312a1f\") " pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.596090 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-config\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.597624 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef24f605-7dad-4ad7-b3cd-783f67312a1f-client-ca\") pod \"route-controller-manager-99c8db557-d7nv2\" (UID: \"ef24f605-7dad-4ad7-b3cd-783f67312a1f\") " pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.597631 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhtbg\" (UniqueName: \"kubernetes.io/projected/ef24f605-7dad-4ad7-b3cd-783f67312a1f-kube-api-access-qhtbg\") pod \"route-controller-manager-99c8db557-d7nv2\" (UID: \"ef24f605-7dad-4ad7-b3cd-783f67312a1f\") " pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.597646 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-proxy-ca-bundles\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.598127 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-client-ca\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.601566 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-serving-cert\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.601868 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef24f605-7dad-4ad7-b3cd-783f67312a1f-config\") pod \"route-controller-manager-99c8db557-d7nv2\" (UID: \"ef24f605-7dad-4ad7-b3cd-783f67312a1f\") " pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.601878 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef24f605-7dad-4ad7-b3cd-783f67312a1f-serving-cert\") pod \"route-controller-manager-99c8db557-d7nv2\" (UID: \"ef24f605-7dad-4ad7-b3cd-783f67312a1f\") " pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.602611 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-config\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " 
pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.627933 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzl2f\" (UniqueName: \"kubernetes.io/projected/e23f6016-a7b8-4c6a-8838-aee11a75ba0c-kube-api-access-pzl2f\") pod \"controller-manager-7c845d5669-lwb7k\" (UID: \"e23f6016-a7b8-4c6a-8838-aee11a75ba0c\") " pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.634428 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhtbg\" (UniqueName: \"kubernetes.io/projected/ef24f605-7dad-4ad7-b3cd-783f67312a1f-kube-api-access-qhtbg\") pod \"route-controller-manager-99c8db557-d7nv2\" (UID: \"ef24f605-7dad-4ad7-b3cd-783f67312a1f\") " pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.647799 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:56 crc kubenswrapper[4937]: I0225 15:55:56.657783 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:57 crc kubenswrapper[4937]: I0225 15:55:57.404107 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c71b93-1dfa-4837-a5e9-00f14115b891" path="/var/lib/kubelet/pods/11c71b93-1dfa-4837-a5e9-00f14115b891/volumes" Feb 25 15:55:57 crc kubenswrapper[4937]: I0225 15:55:57.405025 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d62779-8acc-4cf5-99f0-a3389d1af7cd" path="/var/lib/kubelet/pods/54d62779-8acc-4cf5-99f0-a3389d1af7cd/volumes" Feb 25 15:55:57 crc kubenswrapper[4937]: I0225 15:55:57.737723 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c845d5669-lwb7k"] Feb 25 15:55:57 crc kubenswrapper[4937]: I0225 15:55:57.801543 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2"] Feb 25 15:55:57 crc kubenswrapper[4937]: I0225 15:55:57.826863 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" event={"ID":"e23f6016-a7b8-4c6a-8838-aee11a75ba0c","Type":"ContainerStarted","Data":"7917a185469d8a3e96e1a3b8af5848295bd4d64b9538cf9f8cb2d9ca89e92e93"} Feb 25 15:55:57 crc kubenswrapper[4937]: I0225 15:55:57.829573 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" event={"ID":"ef24f605-7dad-4ad7-b3cd-783f67312a1f","Type":"ContainerStarted","Data":"e50151d1de1a4864df6bd77da7e212bb6d21215573f7fdbcd09f9ed8883f0a7e"} Feb 25 15:55:58 crc kubenswrapper[4937]: I0225 15:55:58.836531 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" event={"ID":"e23f6016-a7b8-4c6a-8838-aee11a75ba0c","Type":"ContainerStarted","Data":"cc0231300058cbcc40ad2bc4c0d6f330d9ba6d385a9b4ccd961306ed31bfa63d"} Feb 25 15:55:58 crc kubenswrapper[4937]: I0225 15:55:58.840409 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:58 crc kubenswrapper[4937]: I0225 
15:55:58.842313 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" event={"ID":"ef24f605-7dad-4ad7-b3cd-783f67312a1f","Type":"ContainerStarted","Data":"5459f0475693c1d6048862aa55a87bf14ad45348a9915918fece88f91ba4cc3b"} Feb 25 15:55:58 crc kubenswrapper[4937]: I0225 15:55:58.843325 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:58 crc kubenswrapper[4937]: I0225 15:55:58.847482 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" Feb 25 15:55:58 crc kubenswrapper[4937]: I0225 15:55:58.847635 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" Feb 25 15:55:58 crc kubenswrapper[4937]: I0225 15:55:58.867192 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c845d5669-lwb7k" podStartSLOduration=4.867165884 podStartE2EDuration="4.867165884s" podCreationTimestamp="2026-02-25 15:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:55:58.858142717 +0000 UTC m=+609.871534647" watchObservedRunningTime="2026-02-25 15:55:58.867165884 +0000 UTC m=+609.880557814" Feb 25 15:55:58 crc kubenswrapper[4937]: I0225 15:55:58.903410 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-99c8db557-d7nv2" podStartSLOduration=4.903387378 podStartE2EDuration="4.903387378s" podCreationTimestamp="2026-02-25 15:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:55:58.900820773 +0000 UTC m=+609.914212673" watchObservedRunningTime="2026-02-25 15:55:58.903387378 +0000 UTC m=+609.916779268" Feb 25 15:56:00 crc kubenswrapper[4937]: I0225 15:56:00.151951 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533916-54vkc"] Feb 25 15:56:00 crc kubenswrapper[4937]: I0225 15:56:00.153018 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533916-54vkc" Feb 25 15:56:00 crc kubenswrapper[4937]: I0225 15:56:00.157101 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 15:56:00 crc kubenswrapper[4937]: I0225 15:56:00.157616 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 15:56:00 crc kubenswrapper[4937]: I0225 15:56:00.159825 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 15:56:00 crc kubenswrapper[4937]: I0225 15:56:00.161392 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533916-54vkc"] Feb 25 15:56:00 crc kubenswrapper[4937]: I0225 15:56:00.351361 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrx7k\" (UniqueName: \"kubernetes.io/projected/98e29452-771e-4286-9898-d0b8a14bb0e7-kube-api-access-lrx7k\") pod \"auto-csr-approver-29533916-54vkc\" (UID: \"98e29452-771e-4286-9898-d0b8a14bb0e7\") " pod="openshift-infra/auto-csr-approver-29533916-54vkc" Feb 25 15:56:00 crc kubenswrapper[4937]: I0225 15:56:00.452949 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrx7k\" (UniqueName: \"kubernetes.io/projected/98e29452-771e-4286-9898-d0b8a14bb0e7-kube-api-access-lrx7k\") pod \"auto-csr-approver-29533916-54vkc\" (UID: \"98e29452-771e-4286-9898-d0b8a14bb0e7\") " pod="openshift-infra/auto-csr-approver-29533916-54vkc" Feb 25 15:56:00 crc kubenswrapper[4937]: I0225 15:56:00.490131 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrx7k\" (UniqueName: \"kubernetes.io/projected/98e29452-771e-4286-9898-d0b8a14bb0e7-kube-api-access-lrx7k\") pod \"auto-csr-approver-29533916-54vkc\" (UID: \"98e29452-771e-4286-9898-d0b8a14bb0e7\") " pod="openshift-infra/auto-csr-approver-29533916-54vkc" Feb 25 15:56:00 crc kubenswrapper[4937]: I0225 15:56:00.779551 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533916-54vkc" Feb 25 15:56:01 crc kubenswrapper[4937]: I0225 15:56:01.186665 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533916-54vkc"] Feb 25 15:56:01 crc kubenswrapper[4937]: W0225 15:56:01.200298 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e29452_771e_4286_9898_d0b8a14bb0e7.slice/crio-24588dfc661959bec2d85b8c98a514e7c85cc44d5b38530b73d3438368415cee WatchSource:0}: Error finding container 24588dfc661959bec2d85b8c98a514e7c85cc44d5b38530b73d3438368415cee: Status 404 returned error can't find the container with id 24588dfc661959bec2d85b8c98a514e7c85cc44d5b38530b73d3438368415cee Feb 25 15:56:01 crc kubenswrapper[4937]: I0225 15:56:01.203270 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 15:56:01 crc kubenswrapper[4937]: I0225 15:56:01.880395 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533916-54vkc" event={"ID":"98e29452-771e-4286-9898-d0b8a14bb0e7","Type":"ContainerStarted","Data":"24588dfc661959bec2d85b8c98a514e7c85cc44d5b38530b73d3438368415cee"} Feb 25 15:56:02 crc kubenswrapper[4937]: I0225 15:56:02.889273 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533916-54vkc" event={"ID":"98e29452-771e-4286-9898-d0b8a14bb0e7","Type":"ContainerStarted","Data":"db6e92b472e6c485207bd90cc9b71ceb4b2506038df4c6f7fd5b19e7900b60ce"} Feb 25 15:56:02 crc kubenswrapper[4937]: I0225 15:56:02.913787 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533916-54vkc" podStartSLOduration=1.579942549 podStartE2EDuration="2.913759868s" podCreationTimestamp="2026-02-25 15:56:00 +0000 UTC" firstStartedPulling="2026-02-25 15:56:01.203028653 +0000 UTC m=+612.216420543" lastFinishedPulling="2026-02-25 15:56:02.536845912 +0000 UTC m=+613.550237862" observedRunningTime="2026-02-25 15:56:02.906991637 +0000 UTC m=+613.920383527" watchObservedRunningTime="2026-02-25 15:56:02.913759868 +0000 UTC m=+613.927151788" Feb 25 15:56:03 crc kubenswrapper[4937]: I0225 15:56:03.897294 4937 generic.go:334] "Generic (PLEG): container finished" podID="98e29452-771e-4286-9898-d0b8a14bb0e7" containerID="db6e92b472e6c485207bd90cc9b71ceb4b2506038df4c6f7fd5b19e7900b60ce" exitCode=0 Feb 25 15:56:03 crc kubenswrapper[4937]: I0225 15:56:03.897366 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533916-54vkc" event={"ID":"98e29452-771e-4286-9898-d0b8a14bb0e7","Type":"ContainerDied","Data":"db6e92b472e6c485207bd90cc9b71ceb4b2506038df4c6f7fd5b19e7900b60ce"} Feb 25 15:56:05 crc kubenswrapper[4937]: I0225 15:56:05.281362 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533916-54vkc" Feb 25 15:56:05 crc kubenswrapper[4937]: I0225 15:56:05.420440 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrx7k\" (UniqueName: \"kubernetes.io/projected/98e29452-771e-4286-9898-d0b8a14bb0e7-kube-api-access-lrx7k\") pod \"98e29452-771e-4286-9898-d0b8a14bb0e7\" (UID: \"98e29452-771e-4286-9898-d0b8a14bb0e7\") " Feb 25 15:56:05 crc kubenswrapper[4937]: I0225 15:56:05.428858 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e29452-771e-4286-9898-d0b8a14bb0e7-kube-api-access-lrx7k" (OuterVolumeSpecName: "kube-api-access-lrx7k") pod "98e29452-771e-4286-9898-d0b8a14bb0e7" (UID: "98e29452-771e-4286-9898-d0b8a14bb0e7"). InnerVolumeSpecName "kube-api-access-lrx7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:56:05 crc kubenswrapper[4937]: I0225 15:56:05.521940 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrx7k\" (UniqueName: \"kubernetes.io/projected/98e29452-771e-4286-9898-d0b8a14bb0e7-kube-api-access-lrx7k\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:05 crc kubenswrapper[4937]: I0225 15:56:05.917458 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533916-54vkc" event={"ID":"98e29452-771e-4286-9898-d0b8a14bb0e7","Type":"ContainerDied","Data":"24588dfc661959bec2d85b8c98a514e7c85cc44d5b38530b73d3438368415cee"} Feb 25 15:56:05 crc kubenswrapper[4937]: I0225 15:56:05.917662 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24588dfc661959bec2d85b8c98a514e7c85cc44d5b38530b73d3438368415cee" Feb 25 15:56:05 crc kubenswrapper[4937]: I0225 15:56:05.917804 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533916-54vkc" Feb 25 15:56:05 crc kubenswrapper[4937]: I0225 15:56:05.969467 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533910-n4bgf"] Feb 25 15:56:05 crc kubenswrapper[4937]: I0225 15:56:05.972652 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533910-n4bgf"] Feb 25 15:56:07 crc kubenswrapper[4937]: I0225 15:56:07.377553 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e6e93a3-6673-464f-84a3-5585a6cbc0a8" path="/var/lib/kubelet/pods/6e6e93a3-6673-464f-84a3-5585a6cbc0a8/volumes" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.531453 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-54sqd"] Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.533673 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-54sqd" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" containerName="registry-server" containerID="cri-o://388d7566d873d97ed691ebcb5de6c17076905eda19ee0aee5fc9fdb5f629be47" gracePeriod=30 Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.547479 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l8xkp"] Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.547900 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l8xkp" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" containerName="registry-server" containerID="cri-o://35a8ce8d7df5f75b07fc4c92a504fb00daf584ba25f36d27f4257af87a40d6e9" gracePeriod=30 Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.562034 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r5bpn"] Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.562356 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" containerID="cri-o://73b9753cfdf2d17dad772e595e65de687ed31b8d429b43e2af19002994219da0" gracePeriod=30 Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.578783 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsxxs"] Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.579106 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gsxxs" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" containerName="registry-server" containerID="cri-o://ee3fb963f39141446a76e27da17b6222c78c082df995f1bf53363abfd73aebb0" gracePeriod=30 Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.584859 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-scndr"] Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.585134 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-scndr" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" containerName="registry-server" containerID="cri-o://c17984d83e8adf1748985ec49508a7289a7aee5dcf65ab112c2b756d0df53d3a" gracePeriod=30 Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.596063 4937 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-nbj4m"] Feb 25 15:56:26 crc kubenswrapper[4937]: E0225 15:56:26.596346 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e29452-771e-4286-9898-d0b8a14bb0e7" containerName="oc" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.596361 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e29452-771e-4286-9898-d0b8a14bb0e7" containerName="oc" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.596461 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e29452-771e-4286-9898-d0b8a14bb0e7" containerName="oc" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.597145 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.599524 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nbj4m"] Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.644867 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw22d\" (UniqueName: \"kubernetes.io/projected/44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6-kube-api-access-kw22d\") pod \"marketplace-operator-79b997595-nbj4m\" (UID: \"44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.645042 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nbj4m\" (UID: \"44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.645105 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nbj4m\" (UID: \"44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.746218 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw22d\" (UniqueName: \"kubernetes.io/projected/44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6-kube-api-access-kw22d\") pod \"marketplace-operator-79b997595-nbj4m\" (UID: \"44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.746763 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nbj4m\" (UID: \"44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.746796 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-nbj4m\" (UID: \"44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.748419 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nbj4m\" (UID: \"44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.755860 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nbj4m\" (UID: \"44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.763728 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw22d\" (UniqueName: \"kubernetes.io/projected/44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6-kube-api-access-kw22d\") pod \"marketplace-operator-79b997595-nbj4m\" (UID: \"44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" Feb 25 15:56:26 crc kubenswrapper[4937]: I0225 15:56:26.895574 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.017617 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.063645 4937 generic.go:334] "Generic (PLEG): container finished" podID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" containerID="ee3fb963f39141446a76e27da17b6222c78c082df995f1bf53363abfd73aebb0" exitCode=0 Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.063738 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsxxs" event={"ID":"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b","Type":"ContainerDied","Data":"ee3fb963f39141446a76e27da17b6222c78c082df995f1bf53363abfd73aebb0"} Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.066061 4937 generic.go:334] "Generic (PLEG): container finished" podID="dc970acf-3cdb-4951-8f35-705ce003550f" containerID="c17984d83e8adf1748985ec49508a7289a7aee5dcf65ab112c2b756d0df53d3a" exitCode=0 Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.066155 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scndr" event={"ID":"dc970acf-3cdb-4951-8f35-705ce003550f","Type":"ContainerDied","Data":"c17984d83e8adf1748985ec49508a7289a7aee5dcf65ab112c2b756d0df53d3a"} Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.068879 4937 generic.go:334] "Generic (PLEG): container finished" podID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" containerID="35a8ce8d7df5f75b07fc4c92a504fb00daf584ba25f36d27f4257af87a40d6e9" exitCode=0 Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.068989 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8xkp" 
event={"ID":"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15","Type":"ContainerDied","Data":"35a8ce8d7df5f75b07fc4c92a504fb00daf584ba25f36d27f4257af87a40d6e9"} Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.071744 4937 generic.go:334] "Generic (PLEG): container finished" podID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" containerID="388d7566d873d97ed691ebcb5de6c17076905eda19ee0aee5fc9fdb5f629be47" exitCode=0 Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.071824 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54sqd" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.071845 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54sqd" event={"ID":"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c","Type":"ContainerDied","Data":"388d7566d873d97ed691ebcb5de6c17076905eda19ee0aee5fc9fdb5f629be47"} Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.071963 4937 scope.go:117] "RemoveContainer" containerID="388d7566d873d97ed691ebcb5de6c17076905eda19ee0aee5fc9fdb5f629be47" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.072012 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54sqd" event={"ID":"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c","Type":"ContainerDied","Data":"72df2f9c05873a1ef6f206ebb11f72d680317d19b0fc9a3261249d147dccc712"} Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.074579 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-r5bpn_906509ff-be49-4c28-95b5-9f80cb885ece/marketplace-operator/2.log" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.074613 4937 generic.go:334] "Generic (PLEG): container finished" podID="906509ff-be49-4c28-95b5-9f80cb885ece" containerID="73b9753cfdf2d17dad772e595e65de687ed31b8d429b43e2af19002994219da0" exitCode=0 Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.074640 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" event={"ID":"906509ff-be49-4c28-95b5-9f80cb885ece","Type":"ContainerDied","Data":"73b9753cfdf2d17dad772e595e65de687ed31b8d429b43e2af19002994219da0"} Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.108352 4937 scope.go:117] "RemoveContainer" containerID="9f1a8301500e621cbb777d1bdbf3e0d51d4638711e48923cf27422aa55f63267" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.139141 4937 scope.go:117] "RemoveContainer" containerID="e13bcce78b2fbe9c228202da6952862f6a9736a03c836494aa4cc106e67db74b" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.150861 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-catalog-content\") pod \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\" (UID: \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.150922 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-utilities\") pod \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\" (UID: \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.150995 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4vcl\" (UniqueName: 
\"kubernetes.io/projected/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-kube-api-access-h4vcl\") pod \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\" (UID: \"af3c547e-6bf2-4fd5-b375-5ad1c2c6959c\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.151847 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-utilities" (OuterVolumeSpecName: "utilities") pod "af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" (UID: "af3c547e-6bf2-4fd5-b375-5ad1c2c6959c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.157364 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-kube-api-access-h4vcl" (OuterVolumeSpecName: "kube-api-access-h4vcl") pod "af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" (UID: "af3c547e-6bf2-4fd5-b375-5ad1c2c6959c"). InnerVolumeSpecName "kube-api-access-h4vcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.164130 4937 scope.go:117] "RemoveContainer" containerID="388d7566d873d97ed691ebcb5de6c17076905eda19ee0aee5fc9fdb5f629be47" Feb 25 15:56:27 crc kubenswrapper[4937]: E0225 15:56:27.167852 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388d7566d873d97ed691ebcb5de6c17076905eda19ee0aee5fc9fdb5f629be47\": container with ID starting with 388d7566d873d97ed691ebcb5de6c17076905eda19ee0aee5fc9fdb5f629be47 not found: ID does not exist" containerID="388d7566d873d97ed691ebcb5de6c17076905eda19ee0aee5fc9fdb5f629be47" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.167915 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388d7566d873d97ed691ebcb5de6c17076905eda19ee0aee5fc9fdb5f629be47"} err="failed to get container status \"388d7566d873d97ed691ebcb5de6c17076905eda19ee0aee5fc9fdb5f629be47\": rpc error: code = NotFound desc = could not find container \"388d7566d873d97ed691ebcb5de6c17076905eda19ee0aee5fc9fdb5f629be47\": container with ID starting with 388d7566d873d97ed691ebcb5de6c17076905eda19ee0aee5fc9fdb5f629be47 not found: ID does not exist" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.167950 4937 scope.go:117] "RemoveContainer" containerID="9f1a8301500e621cbb777d1bdbf3e0d51d4638711e48923cf27422aa55f63267" Feb 25 15:56:27 crc kubenswrapper[4937]: E0225 15:56:27.168364 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f1a8301500e621cbb777d1bdbf3e0d51d4638711e48923cf27422aa55f63267\": container with ID starting with 9f1a8301500e621cbb777d1bdbf3e0d51d4638711e48923cf27422aa55f63267 not found: ID does not exist" containerID="9f1a8301500e621cbb777d1bdbf3e0d51d4638711e48923cf27422aa55f63267" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.168406 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f1a8301500e621cbb777d1bdbf3e0d51d4638711e48923cf27422aa55f63267"} err="failed to get container status \"9f1a8301500e621cbb777d1bdbf3e0d51d4638711e48923cf27422aa55f63267\": rpc error: code = NotFound desc = could not find container \"9f1a8301500e621cbb777d1bdbf3e0d51d4638711e48923cf27422aa55f63267\": container with ID starting with 9f1a8301500e621cbb777d1bdbf3e0d51d4638711e48923cf27422aa55f63267 not found: ID does not exist" Feb 25 
15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.168437 4937 scope.go:117] "RemoveContainer" containerID="e13bcce78b2fbe9c228202da6952862f6a9736a03c836494aa4cc106e67db74b" Feb 25 15:56:27 crc kubenswrapper[4937]: E0225 15:56:27.168903 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e13bcce78b2fbe9c228202da6952862f6a9736a03c836494aa4cc106e67db74b\": container with ID starting with e13bcce78b2fbe9c228202da6952862f6a9736a03c836494aa4cc106e67db74b not found: ID does not exist" containerID="e13bcce78b2fbe9c228202da6952862f6a9736a03c836494aa4cc106e67db74b" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.168956 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e13bcce78b2fbe9c228202da6952862f6a9736a03c836494aa4cc106e67db74b"} err="failed to get container status \"e13bcce78b2fbe9c228202da6952862f6a9736a03c836494aa4cc106e67db74b\": rpc error: code = NotFound desc = could not find container \"e13bcce78b2fbe9c228202da6952862f6a9736a03c836494aa4cc106e67db74b\": container with ID starting with e13bcce78b2fbe9c228202da6952862f6a9736a03c836494aa4cc106e67db74b not found: ID does not exist" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.168975 4937 scope.go:117] "RemoveContainer" containerID="74b7b25a4e53b9ba1dc21b5169e1b8a1dd55cbb38c6b31e7b8e2d6cb94af884a" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.206548 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" (UID: "af3c547e-6bf2-4fd5-b375-5ad1c2c6959c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.252131 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.252162 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.252174 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4vcl\" (UniqueName: \"kubernetes.io/projected/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c-kube-api-access-h4vcl\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: E0225 15:56:27.309592 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ee3fb963f39141446a76e27da17b6222c78c082df995f1bf53363abfd73aebb0 is running failed: container process not found" containerID="ee3fb963f39141446a76e27da17b6222c78c082df995f1bf53363abfd73aebb0" cmd=["grpc_health_probe","-addr=:50051"] Feb 25 15:56:27 crc kubenswrapper[4937]: E0225 15:56:27.310413 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ee3fb963f39141446a76e27da17b6222c78c082df995f1bf53363abfd73aebb0 is running failed: container process not found" containerID="ee3fb963f39141446a76e27da17b6222c78c082df995f1bf53363abfd73aebb0" cmd=["grpc_health_probe","-addr=:50051"] Feb 25 15:56:27 crc kubenswrapper[4937]: E0225 15:56:27.311798 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ee3fb963f39141446a76e27da17b6222c78c082df995f1bf53363abfd73aebb0 is running failed: container process not found" containerID="ee3fb963f39141446a76e27da17b6222c78c082df995f1bf53363abfd73aebb0" cmd=["grpc_health_probe","-addr=:50051"] Feb 25 15:56:27 crc kubenswrapper[4937]: E0225 15:56:27.311845 4937 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ee3fb963f39141446a76e27da17b6222c78c082df995f1bf53363abfd73aebb0 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-gsxxs" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" containerName="registry-server" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.340670 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.347860 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.353995 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgc6d\" (UniqueName: \"kubernetes.io/projected/dc970acf-3cdb-4951-8f35-705ce003550f-kube-api-access-sgc6d\") pod \"dc970acf-3cdb-4951-8f35-705ce003550f\" (UID: \"dc970acf-3cdb-4951-8f35-705ce003550f\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.354056 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/906509ff-be49-4c28-95b5-9f80cb885ece-marketplace-operator-metrics\") pod \"906509ff-be49-4c28-95b5-9f80cb885ece\" (UID: \"906509ff-be49-4c28-95b5-9f80cb885ece\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.354091 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/906509ff-be49-4c28-95b5-9f80cb885ece-marketplace-trusted-ca\") pod \"906509ff-be49-4c28-95b5-9f80cb885ece\" (UID: \"906509ff-be49-4c28-95b5-9f80cb885ece\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.354130 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc970acf-3cdb-4951-8f35-705ce003550f-utilities\") pod \"dc970acf-3cdb-4951-8f35-705ce003550f\" (UID: \"dc970acf-3cdb-4951-8f35-705ce003550f\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.354156 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc970acf-3cdb-4951-8f35-705ce003550f-catalog-content\") pod \"dc970acf-3cdb-4951-8f35-705ce003550f\" (UID: \"dc970acf-3cdb-4951-8f35-705ce003550f\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.354221 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvhhm\" (UniqueName: \"kubernetes.io/projected/906509ff-be49-4c28-95b5-9f80cb885ece-kube-api-access-hvhhm\") pod \"906509ff-be49-4c28-95b5-9f80cb885ece\" (UID: \"906509ff-be49-4c28-95b5-9f80cb885ece\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.355668 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/906509ff-be49-4c28-95b5-9f80cb885ece-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "906509ff-be49-4c28-95b5-9f80cb885ece" (UID: "906509ff-be49-4c28-95b5-9f80cb885ece"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.355879 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc970acf-3cdb-4951-8f35-705ce003550f-utilities" (OuterVolumeSpecName: "utilities") pod "dc970acf-3cdb-4951-8f35-705ce003550f" (UID: "dc970acf-3cdb-4951-8f35-705ce003550f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.358104 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc970acf-3cdb-4951-8f35-705ce003550f-kube-api-access-sgc6d" (OuterVolumeSpecName: "kube-api-access-sgc6d") pod "dc970acf-3cdb-4951-8f35-705ce003550f" (UID: "dc970acf-3cdb-4951-8f35-705ce003550f"). 
InnerVolumeSpecName "kube-api-access-sgc6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.360073 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906509ff-be49-4c28-95b5-9f80cb885ece-kube-api-access-hvhhm" (OuterVolumeSpecName: "kube-api-access-hvhhm") pod "906509ff-be49-4c28-95b5-9f80cb885ece" (UID: "906509ff-be49-4c28-95b5-9f80cb885ece"). InnerVolumeSpecName "kube-api-access-hvhhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.361891 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906509ff-be49-4c28-95b5-9f80cb885ece-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "906509ff-be49-4c28-95b5-9f80cb885ece" (UID: "906509ff-be49-4c28-95b5-9f80cb885ece"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.366044 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.437947 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-54sqd"] Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.442477 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-54sqd"] Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.454638 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zktg\" (UniqueName: \"kubernetes.io/projected/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-kube-api-access-7zktg\") pod \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\" (UID: \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.454723 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-utilities\") pod \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\" (UID: \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.454757 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-catalog-content\") pod \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\" (UID: \"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.455054 4937 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/906509ff-be49-4c28-95b5-9f80cb885ece-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.455071 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc970acf-3cdb-4951-8f35-705ce003550f-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.455083 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvhhm\" (UniqueName: \"kubernetes.io/projected/906509ff-be49-4c28-95b5-9f80cb885ece-kube-api-access-hvhhm\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.455095 4937 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgc6d\" (UniqueName: \"kubernetes.io/projected/dc970acf-3cdb-4951-8f35-705ce003550f-kube-api-access-sgc6d\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.455107 4937 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/906509ff-be49-4c28-95b5-9f80cb885ece-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.456815 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-utilities" (OuterVolumeSpecName: "utilities") pod "e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" (UID: "e1e375ad-9093-48c0-8f06-d8ae9ad9b46b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.459676 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-kube-api-access-7zktg" (OuterVolumeSpecName: "kube-api-access-7zktg") pod "e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" (UID: "e1e375ad-9093-48c0-8f06-d8ae9ad9b46b"). InnerVolumeSpecName "kube-api-access-7zktg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.483829 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" (UID: "e1e375ad-9093-48c0-8f06-d8ae9ad9b46b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.486662 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.529103 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc970acf-3cdb-4951-8f35-705ce003550f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc970acf-3cdb-4951-8f35-705ce003550f" (UID: "dc970acf-3cdb-4951-8f35-705ce003550f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.549266 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nbj4m"] Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.555978 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-catalog-content\") pod \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\" (UID: \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.556068 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-utilities\") pod \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\" (UID: \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.556097 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppp72\" (UniqueName: \"kubernetes.io/projected/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-kube-api-access-ppp72\") pod \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\" (UID: \"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15\") " Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.556622 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zktg\" (UniqueName: \"kubernetes.io/projected/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-kube-api-access-7zktg\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.556645 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.556658 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.556672 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc970acf-3cdb-4951-8f35-705ce003550f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.557228 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-utilities" (OuterVolumeSpecName: "utilities") pod "534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" (UID: "534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.561844 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-kube-api-access-ppp72" (OuterVolumeSpecName: "kube-api-access-ppp72") pod "534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" (UID: "534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15"). InnerVolumeSpecName "kube-api-access-ppp72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.621168 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" (UID: "534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.657055 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.657091 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:27 crc kubenswrapper[4937]: I0225 15:56:27.657106 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppp72\" (UniqueName: \"kubernetes.io/projected/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15-kube-api-access-ppp72\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.062377 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6r6g7"] Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.093559 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l8xkp" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.093547 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8xkp" event={"ID":"534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15","Type":"ContainerDied","Data":"9dd9f93e5c1ebefdb1b24a9366cd82c7d13f957fef131406131ae48b58f5402c"} Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.093751 4937 scope.go:117] "RemoveContainer" containerID="35a8ce8d7df5f75b07fc4c92a504fb00daf584ba25f36d27f4257af87a40d6e9" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.098121 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.098100 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" event={"ID":"906509ff-be49-4c28-95b5-9f80cb885ece","Type":"ContainerDied","Data":"0a3340ad207d5a69041d7085021f5b12193a326e7ab9a63db22cad6f744722a1"} Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.106358 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scndr" event={"ID":"dc970acf-3cdb-4951-8f35-705ce003550f","Type":"ContainerDied","Data":"eedbc6a8629695a67983066428f03439464cbbb8a703e13a57024be745db089a"} Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.106409 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-scndr" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.125439 4937 scope.go:117] "RemoveContainer" containerID="987763612bb0738cda4ef2c12210a475f4d3e910f1a99c8e8231554980baa657" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.128956 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsxxs" event={"ID":"e1e375ad-9093-48c0-8f06-d8ae9ad9b46b","Type":"ContainerDied","Data":"f60397d71bc30f5827caa56948c580a994e0fe5420e93840875a795c70c05e1b"} Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.129148 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsxxs" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.149650 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" event={"ID":"44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6","Type":"ContainerStarted","Data":"01bc677f276ed7420079135946398ca3481b6c573b00e629451ea4475ba60b5c"} Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.149696 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" event={"ID":"44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6","Type":"ContainerStarted","Data":"2bcfb0bd8b3f6f2b1a83c3b2081e536b81999af8b3ce4df871786aebea98d681"} Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.150627 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.155009 4937 scope.go:117] "RemoveContainer" containerID="0b24f47aca9f8a7cd478f192b8c4f1c809ef2760c03e7a8fc5afa608bee01929" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.164778 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.172528 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r5bpn"] Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.175142 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r5bpn"] Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.186973 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l8xkp"] Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.195632 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l8xkp"] Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.196383 4937 scope.go:117] "RemoveContainer" containerID="73b9753cfdf2d17dad772e595e65de687ed31b8d429b43e2af19002994219da0" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.220688 4937 scope.go:117] "RemoveContainer" containerID="c17984d83e8adf1748985ec49508a7289a7aee5dcf65ab112c2b756d0df53d3a" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.225099 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nbj4m" podStartSLOduration=2.225079472 podStartE2EDuration="2.225079472s" podCreationTimestamp="2026-02-25 15:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 
15:56:28.221117012 +0000 UTC m=+639.234508912" watchObservedRunningTime="2026-02-25 15:56:28.225079472 +0000 UTC m=+639.238471362" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.235692 4937 scope.go:117] "RemoveContainer" containerID="af08644e562333e00be97dae4e3a62ec89446ec0ceeda29e11bec5daa22f4d10" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.249422 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsxxs"] Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.251756 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsxxs"] Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.263998 4937 scope.go:117] "RemoveContainer" containerID="674a86c36c74adcc12046294564d7e1dc8bbf3dfe538e769d6cea8b801395692" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.268098 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-scndr"] Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.271414 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-scndr"] Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.288960 4937 scope.go:117] "RemoveContainer" containerID="ee3fb963f39141446a76e27da17b6222c78c082df995f1bf53363abfd73aebb0" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.312779 4937 scope.go:117] "RemoveContainer" containerID="ce1478d91047d9edfb30553059efd253e36a6dd4b8e10817ae72246d156f6666" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.314783 4937 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r5bpn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: i/o timeout (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.314849 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r5bpn" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: i/o timeout (Client.Timeout exceeded while awaiting headers)" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.327545 4937 scope.go:117] "RemoveContainer" containerID="87fd06301df005ac0266157675a57d8f32c754d569794513d573076fc4940d4c" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.744902 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8n9q4"] Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745118 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" containerName="registry-server" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745131 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" containerName="registry-server" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745146 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" containerName="extract-utilities" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745153 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" containerName="extract-utilities" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 
15:56:28.745163 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" containerName="registry-server" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745170 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" containerName="registry-server" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745181 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" containerName="extract-content" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745187 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" containerName="extract-content" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745198 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" containerName="extract-utilities" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745204 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" containerName="extract-utilities" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745214 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" containerName="extract-utilities" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745220 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" containerName="extract-utilities" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745231 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" containerName="extract-content" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745238 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" containerName="extract-content" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745247 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745255 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745263 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" containerName="extract-content" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745269 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" containerName="extract-content" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745277 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745283 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745292 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" containerName="registry-server" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745298 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" containerName="registry-server" 
Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745306 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" containerName="extract-content" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745312 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" containerName="extract-content" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745323 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" containerName="extract-utilities" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745329 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" containerName="extract-utilities" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745337 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" containerName="registry-server" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745344 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" containerName="registry-server" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745354 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745361 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745469 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745503 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" containerName="registry-server" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745513 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" containerName="registry-server" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745523 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745533 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" containerName="registry-server" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745542 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" containerName="registry-server" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745553 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" Feb 25 15:56:28 crc kubenswrapper[4937]: E0225 15:56:28.745645 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745653 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.745761 4937 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" containerName="marketplace-operator" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.746827 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.749228 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.760065 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8n9q4"] Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.773596 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4f6s\" (UniqueName: \"kubernetes.io/projected/22c34722-ce3c-4f34-9a65-3a8ccdbb0673-kube-api-access-j4f6s\") pod \"redhat-marketplace-8n9q4\" (UID: \"22c34722-ce3c-4f34-9a65-3a8ccdbb0673\") " pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.773660 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c34722-ce3c-4f34-9a65-3a8ccdbb0673-utilities\") pod \"redhat-marketplace-8n9q4\" (UID: \"22c34722-ce3c-4f34-9a65-3a8ccdbb0673\") " pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.773696 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c34722-ce3c-4f34-9a65-3a8ccdbb0673-catalog-content\") pod \"redhat-marketplace-8n9q4\" (UID: \"22c34722-ce3c-4f34-9a65-3a8ccdbb0673\") " pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.874831 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4f6s\" (UniqueName: \"kubernetes.io/projected/22c34722-ce3c-4f34-9a65-3a8ccdbb0673-kube-api-access-j4f6s\") pod \"redhat-marketplace-8n9q4\" (UID: \"22c34722-ce3c-4f34-9a65-3a8ccdbb0673\") " pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.874875 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c34722-ce3c-4f34-9a65-3a8ccdbb0673-utilities\") pod \"redhat-marketplace-8n9q4\" (UID: \"22c34722-ce3c-4f34-9a65-3a8ccdbb0673\") " pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.874900 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c34722-ce3c-4f34-9a65-3a8ccdbb0673-catalog-content\") pod \"redhat-marketplace-8n9q4\" (UID: \"22c34722-ce3c-4f34-9a65-3a8ccdbb0673\") " pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.875458 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c34722-ce3c-4f34-9a65-3a8ccdbb0673-catalog-content\") pod \"redhat-marketplace-8n9q4\" (UID: \"22c34722-ce3c-4f34-9a65-3a8ccdbb0673\") " pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.875655 4937 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c34722-ce3c-4f34-9a65-3a8ccdbb0673-utilities\") pod \"redhat-marketplace-8n9q4\" (UID: \"22c34722-ce3c-4f34-9a65-3a8ccdbb0673\") " pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.898549 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4f6s\" (UniqueName: \"kubernetes.io/projected/22c34722-ce3c-4f34-9a65-3a8ccdbb0673-kube-api-access-j4f6s\") pod \"redhat-marketplace-8n9q4\" (UID: \"22c34722-ce3c-4f34-9a65-3a8ccdbb0673\") " pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.954120 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fswxh"] Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.955370 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.957747 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 25 15:56:28 crc kubenswrapper[4937]: I0225 15:56:28.976800 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fswxh"] Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.065348 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.077211 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1d5578b-cf1d-4208-91b2-2019dff70a16-catalog-content\") pod \"redhat-operators-fswxh\" (UID: \"d1d5578b-cf1d-4208-91b2-2019dff70a16\") " pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.077286 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4rkv\" (UniqueName: \"kubernetes.io/projected/d1d5578b-cf1d-4208-91b2-2019dff70a16-kube-api-access-j4rkv\") pod \"redhat-operators-fswxh\" (UID: \"d1d5578b-cf1d-4208-91b2-2019dff70a16\") " pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.077694 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1d5578b-cf1d-4208-91b2-2019dff70a16-utilities\") pod \"redhat-operators-fswxh\" (UID: \"d1d5578b-cf1d-4208-91b2-2019dff70a16\") " pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.178924 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1d5578b-cf1d-4208-91b2-2019dff70a16-utilities\") pod \"redhat-operators-fswxh\" (UID: \"d1d5578b-cf1d-4208-91b2-2019dff70a16\") " pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.179009 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1d5578b-cf1d-4208-91b2-2019dff70a16-catalog-content\") pod \"redhat-operators-fswxh\" (UID: 
\"d1d5578b-cf1d-4208-91b2-2019dff70a16\") " pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.179043 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4rkv\" (UniqueName: \"kubernetes.io/projected/d1d5578b-cf1d-4208-91b2-2019dff70a16-kube-api-access-j4rkv\") pod \"redhat-operators-fswxh\" (UID: \"d1d5578b-cf1d-4208-91b2-2019dff70a16\") " pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.181062 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1d5578b-cf1d-4208-91b2-2019dff70a16-catalog-content\") pod \"redhat-operators-fswxh\" (UID: \"d1d5578b-cf1d-4208-91b2-2019dff70a16\") " pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.183058 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1d5578b-cf1d-4208-91b2-2019dff70a16-utilities\") pod \"redhat-operators-fswxh\" (UID: \"d1d5578b-cf1d-4208-91b2-2019dff70a16\") " pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.199332 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4rkv\" (UniqueName: \"kubernetes.io/projected/d1d5578b-cf1d-4208-91b2-2019dff70a16-kube-api-access-j4rkv\") pod \"redhat-operators-fswxh\" (UID: \"d1d5578b-cf1d-4208-91b2-2019dff70a16\") " pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.274920 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.375201 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15" path="/var/lib/kubelet/pods/534065ad-eb70-4ddb-bbf6-b9dbcfc2dc15/volumes" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.376058 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="906509ff-be49-4c28-95b5-9f80cb885ece" path="/var/lib/kubelet/pods/906509ff-be49-4c28-95b5-9f80cb885ece/volumes" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.376661 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3c547e-6bf2-4fd5-b375-5ad1c2c6959c" path="/var/lib/kubelet/pods/af3c547e-6bf2-4fd5-b375-5ad1c2c6959c/volumes" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.377983 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc970acf-3cdb-4951-8f35-705ce003550f" path="/var/lib/kubelet/pods/dc970acf-3cdb-4951-8f35-705ce003550f/volumes" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.378750 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1e375ad-9093-48c0-8f06-d8ae9ad9b46b" path="/var/lib/kubelet/pods/e1e375ad-9093-48c0-8f06-d8ae9ad9b46b/volumes" Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.492522 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8n9q4"] Feb 25 15:56:29 crc kubenswrapper[4937]: W0225 15:56:29.497244 4937 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22c34722_ce3c_4f34_9a65_3a8ccdbb0673.slice/crio-355db19d65c02f0474c9a457480d252757f263e74d99fa0111c7d7b662a48be5 WatchSource:0}: Error finding container 355db19d65c02f0474c9a457480d252757f263e74d99fa0111c7d7b662a48be5: Status 404 returned error can't find the container with id 355db19d65c02f0474c9a457480d252757f263e74d99fa0111c7d7b662a48be5 Feb 25 15:56:29 crc kubenswrapper[4937]: I0225 15:56:29.727677 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fswxh"] Feb 25 15:56:29 crc kubenswrapper[4937]: W0225 15:56:29.769287 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1d5578b_cf1d_4208_91b2_2019dff70a16.slice/crio-0f7fb2e43cf60193752230d9c091ad7d8b41a615624a601d07fa44da626efaa9 WatchSource:0}: Error finding container 0f7fb2e43cf60193752230d9c091ad7d8b41a615624a601d07fa44da626efaa9: Status 404 returned error can't find the container with id 0f7fb2e43cf60193752230d9c091ad7d8b41a615624a601d07fa44da626efaa9 Feb 25 15:56:30 crc kubenswrapper[4937]: I0225 15:56:30.187753 4937 generic.go:334] "Generic (PLEG): container finished" podID="22c34722-ce3c-4f34-9a65-3a8ccdbb0673" containerID="21ffd85145a98f3d7c4bb955c4d86ef93495ed72e3ccd3ee314e5f9bcfb92acc" exitCode=0 Feb 25 15:56:30 crc kubenswrapper[4937]: I0225 15:56:30.187869 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8n9q4" event={"ID":"22c34722-ce3c-4f34-9a65-3a8ccdbb0673","Type":"ContainerDied","Data":"21ffd85145a98f3d7c4bb955c4d86ef93495ed72e3ccd3ee314e5f9bcfb92acc"} Feb 25 15:56:30 crc kubenswrapper[4937]: I0225 15:56:30.187928 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8n9q4" event={"ID":"22c34722-ce3c-4f34-9a65-3a8ccdbb0673","Type":"ContainerStarted","Data":"355db19d65c02f0474c9a457480d252757f263e74d99fa0111c7d7b662a48be5"} Feb 25 15:56:30 crc kubenswrapper[4937]: I0225 15:56:30.190248 4937 generic.go:334] "Generic (PLEG): container finished" podID="d1d5578b-cf1d-4208-91b2-2019dff70a16" containerID="6376654cecda31816716804df1bb94b771297f7824701eb998b2471b9ff14eb3" exitCode=0 Feb 25 15:56:30 crc kubenswrapper[4937]: I0225 15:56:30.190561 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fswxh" event={"ID":"d1d5578b-cf1d-4208-91b2-2019dff70a16","Type":"ContainerDied","Data":"6376654cecda31816716804df1bb94b771297f7824701eb998b2471b9ff14eb3"} Feb 25 15:56:30 crc kubenswrapper[4937]: I0225 15:56:30.190605 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fswxh" event={"ID":"d1d5578b-cf1d-4208-91b2-2019dff70a16","Type":"ContainerStarted","Data":"0f7fb2e43cf60193752230d9c091ad7d8b41a615624a601d07fa44da626efaa9"} Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.147582 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hmbgl"] Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.149696 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.153096 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.157201 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hmbgl"] Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.306742 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f770fc-fde6-4340-8c2a-a33e619cb169-catalog-content\") pod \"certified-operators-hmbgl\" (UID: \"e9f770fc-fde6-4340-8c2a-a33e619cb169\") " pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.306794 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s962d\" (UniqueName: \"kubernetes.io/projected/e9f770fc-fde6-4340-8c2a-a33e619cb169-kube-api-access-s962d\") pod \"certified-operators-hmbgl\" (UID: \"e9f770fc-fde6-4340-8c2a-a33e619cb169\") " pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.306818 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f770fc-fde6-4340-8c2a-a33e619cb169-utilities\") pod \"certified-operators-hmbgl\" (UID: \"e9f770fc-fde6-4340-8c2a-a33e619cb169\") " pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.351552 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6dqwg"] Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.353592 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.355888 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.364462 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6dqwg"] Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.408639 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s962d\" (UniqueName: \"kubernetes.io/projected/e9f770fc-fde6-4340-8c2a-a33e619cb169-kube-api-access-s962d\") pod \"certified-operators-hmbgl\" (UID: \"e9f770fc-fde6-4340-8c2a-a33e619cb169\") " pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.408708 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f770fc-fde6-4340-8c2a-a33e619cb169-utilities\") pod \"certified-operators-hmbgl\" (UID: \"e9f770fc-fde6-4340-8c2a-a33e619cb169\") " pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.408747 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18855c35-8e7b-4089-848f-e325b779dc51-catalog-content\") pod \"community-operators-6dqwg\" (UID: \"18855c35-8e7b-4089-848f-e325b779dc51\") " pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.408801 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hww7n\" (UniqueName: \"kubernetes.io/projected/18855c35-8e7b-4089-848f-e325b779dc51-kube-api-access-hww7n\") pod \"community-operators-6dqwg\" (UID: \"18855c35-8e7b-4089-848f-e325b779dc51\") " pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.408845 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18855c35-8e7b-4089-848f-e325b779dc51-utilities\") pod \"community-operators-6dqwg\" (UID: \"18855c35-8e7b-4089-848f-e325b779dc51\") " pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.408917 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f770fc-fde6-4340-8c2a-a33e619cb169-catalog-content\") pod \"certified-operators-hmbgl\" (UID: \"e9f770fc-fde6-4340-8c2a-a33e619cb169\") " pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.409180 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f770fc-fde6-4340-8c2a-a33e619cb169-utilities\") pod \"certified-operators-hmbgl\" (UID: \"e9f770fc-fde6-4340-8c2a-a33e619cb169\") " pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.409349 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f770fc-fde6-4340-8c2a-a33e619cb169-catalog-content\") pod \"certified-operators-hmbgl\" (UID: 
\"e9f770fc-fde6-4340-8c2a-a33e619cb169\") " pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.431237 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s962d\" (UniqueName: \"kubernetes.io/projected/e9f770fc-fde6-4340-8c2a-a33e619cb169-kube-api-access-s962d\") pod \"certified-operators-hmbgl\" (UID: \"e9f770fc-fde6-4340-8c2a-a33e619cb169\") " pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.468435 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.509673 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hww7n\" (UniqueName: \"kubernetes.io/projected/18855c35-8e7b-4089-848f-e325b779dc51-kube-api-access-hww7n\") pod \"community-operators-6dqwg\" (UID: \"18855c35-8e7b-4089-848f-e325b779dc51\") " pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.509720 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18855c35-8e7b-4089-848f-e325b779dc51-utilities\") pod \"community-operators-6dqwg\" (UID: \"18855c35-8e7b-4089-848f-e325b779dc51\") " pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.509779 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18855c35-8e7b-4089-848f-e325b779dc51-catalog-content\") pod \"community-operators-6dqwg\" (UID: \"18855c35-8e7b-4089-848f-e325b779dc51\") " pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.510164 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18855c35-8e7b-4089-848f-e325b779dc51-catalog-content\") pod \"community-operators-6dqwg\" (UID: \"18855c35-8e7b-4089-848f-e325b779dc51\") " pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.510296 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18855c35-8e7b-4089-848f-e325b779dc51-utilities\") pod \"community-operators-6dqwg\" (UID: \"18855c35-8e7b-4089-848f-e325b779dc51\") " pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.529042 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hww7n\" (UniqueName: \"kubernetes.io/projected/18855c35-8e7b-4089-848f-e325b779dc51-kube-api-access-hww7n\") pod \"community-operators-6dqwg\" (UID: \"18855c35-8e7b-4089-848f-e325b779dc51\") " pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.685924 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:31 crc kubenswrapper[4937]: I0225 15:56:31.907048 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hmbgl"] Feb 25 15:56:31 crc kubenswrapper[4937]: W0225 15:56:31.916878 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9f770fc_fde6_4340_8c2a_a33e619cb169.slice/crio-0f4e3e4c8bf090c3cdf4d476c991afb5f5ea9a000368b7d24f5bc47e07ffc11a WatchSource:0}: Error finding container 0f4e3e4c8bf090c3cdf4d476c991afb5f5ea9a000368b7d24f5bc47e07ffc11a: Status 404 returned error can't find the container with id 0f4e3e4c8bf090c3cdf4d476c991afb5f5ea9a000368b7d24f5bc47e07ffc11a Feb 25 15:56:32 crc kubenswrapper[4937]: I0225 15:56:32.202714 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6dqwg"] Feb 25 15:56:32 crc kubenswrapper[4937]: W0225 15:56:32.205026 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18855c35_8e7b_4089_848f_e325b779dc51.slice/crio-c31cc8c5d2f4def26894cd9326adf7c6f7ae53132f770bf023c2f8d0e7fb149c WatchSource:0}: Error finding container c31cc8c5d2f4def26894cd9326adf7c6f7ae53132f770bf023c2f8d0e7fb149c: Status 404 returned error can't find the container with id c31cc8c5d2f4def26894cd9326adf7c6f7ae53132f770bf023c2f8d0e7fb149c Feb 25 15:56:32 crc kubenswrapper[4937]: I0225 15:56:32.205386 4937 generic.go:334] "Generic (PLEG): container finished" podID="22c34722-ce3c-4f34-9a65-3a8ccdbb0673" containerID="b04b4be50cfabd7017af0affffd2bf55ca06b61ec25d0e29bd8bc5a146ab012c" exitCode=0 Feb 25 15:56:32 crc kubenswrapper[4937]: I0225 15:56:32.205468 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8n9q4" event={"ID":"22c34722-ce3c-4f34-9a65-3a8ccdbb0673","Type":"ContainerDied","Data":"b04b4be50cfabd7017af0affffd2bf55ca06b61ec25d0e29bd8bc5a146ab012c"} Feb 25 15:56:32 crc kubenswrapper[4937]: I0225 15:56:32.208366 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fswxh" event={"ID":"d1d5578b-cf1d-4208-91b2-2019dff70a16","Type":"ContainerStarted","Data":"30bd740768c46947803f7d6cb8e5511558a291660c5c58c6f05c12c16770b9bd"} Feb 25 15:56:32 crc kubenswrapper[4937]: I0225 15:56:32.231388 4937 generic.go:334] "Generic (PLEG): container finished" podID="e9f770fc-fde6-4340-8c2a-a33e619cb169" containerID="933cdd88bf964e57f8f7a1c4e561f04f1c78964d441dbad7eee28a91cf02ba3c" exitCode=0 Feb 25 15:56:32 crc kubenswrapper[4937]: I0225 15:56:32.231451 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmbgl" event={"ID":"e9f770fc-fde6-4340-8c2a-a33e619cb169","Type":"ContainerDied","Data":"933cdd88bf964e57f8f7a1c4e561f04f1c78964d441dbad7eee28a91cf02ba3c"} Feb 25 15:56:32 crc kubenswrapper[4937]: I0225 15:56:32.231492 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmbgl" event={"ID":"e9f770fc-fde6-4340-8c2a-a33e619cb169","Type":"ContainerStarted","Data":"0f4e3e4c8bf090c3cdf4d476c991afb5f5ea9a000368b7d24f5bc47e07ffc11a"} Feb 25 15:56:33 crc kubenswrapper[4937]: I0225 15:56:33.238784 4937 generic.go:334] "Generic (PLEG): container finished" podID="18855c35-8e7b-4089-848f-e325b779dc51" containerID="f32c07e9103803045aa4bd823377507ae5b6a2d5368f9c1db8fe57b2e514dbbe" 
exitCode=0 Feb 25 15:56:33 crc kubenswrapper[4937]: I0225 15:56:33.238885 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6dqwg" event={"ID":"18855c35-8e7b-4089-848f-e325b779dc51","Type":"ContainerDied","Data":"f32c07e9103803045aa4bd823377507ae5b6a2d5368f9c1db8fe57b2e514dbbe"} Feb 25 15:56:33 crc kubenswrapper[4937]: I0225 15:56:33.239408 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6dqwg" event={"ID":"18855c35-8e7b-4089-848f-e325b779dc51","Type":"ContainerStarted","Data":"c31cc8c5d2f4def26894cd9326adf7c6f7ae53132f770bf023c2f8d0e7fb149c"} Feb 25 15:56:33 crc kubenswrapper[4937]: I0225 15:56:33.247087 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8n9q4" event={"ID":"22c34722-ce3c-4f34-9a65-3a8ccdbb0673","Type":"ContainerStarted","Data":"514a5bd5b30a4d9416b79149ad5a33aea47fa38a49edba29f2afedc83c638b0d"} Feb 25 15:56:33 crc kubenswrapper[4937]: I0225 15:56:33.253441 4937 generic.go:334] "Generic (PLEG): container finished" podID="d1d5578b-cf1d-4208-91b2-2019dff70a16" containerID="30bd740768c46947803f7d6cb8e5511558a291660c5c58c6f05c12c16770b9bd" exitCode=0 Feb 25 15:56:33 crc kubenswrapper[4937]: I0225 15:56:33.253519 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fswxh" event={"ID":"d1d5578b-cf1d-4208-91b2-2019dff70a16","Type":"ContainerDied","Data":"30bd740768c46947803f7d6cb8e5511558a291660c5c58c6f05c12c16770b9bd"} Feb 25 15:56:33 crc kubenswrapper[4937]: I0225 15:56:33.281204 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8n9q4" podStartSLOduration=2.581325417 podStartE2EDuration="5.281188998s" podCreationTimestamp="2026-02-25 15:56:28 +0000 UTC" firstStartedPulling="2026-02-25 15:56:30.19010817 +0000 UTC m=+641.203500060" lastFinishedPulling="2026-02-25 15:56:32.889971751 +0000 UTC m=+643.903363641" observedRunningTime="2026-02-25 15:56:33.276677874 +0000 UTC m=+644.290069764" watchObservedRunningTime="2026-02-25 15:56:33.281188998 +0000 UTC m=+644.294580888" Feb 25 15:56:35 crc kubenswrapper[4937]: I0225 15:56:35.266214 4937 generic.go:334] "Generic (PLEG): container finished" podID="e9f770fc-fde6-4340-8c2a-a33e619cb169" containerID="0f13b05073e93fe4f8c0b053a6797be189e7156234226e4c16195b927c05c16a" exitCode=0 Feb 25 15:56:35 crc kubenswrapper[4937]: I0225 15:56:35.266271 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmbgl" event={"ID":"e9f770fc-fde6-4340-8c2a-a33e619cb169","Type":"ContainerDied","Data":"0f13b05073e93fe4f8c0b053a6797be189e7156234226e4c16195b927c05c16a"} Feb 25 15:56:35 crc kubenswrapper[4937]: I0225 15:56:35.269107 4937 generic.go:334] "Generic (PLEG): container finished" podID="18855c35-8e7b-4089-848f-e325b779dc51" containerID="30664f2ee4cbc69d8796026dad9f8016272855565c716accc4bf2a9ed9108043" exitCode=0 Feb 25 15:56:35 crc kubenswrapper[4937]: I0225 15:56:35.269300 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6dqwg" event={"ID":"18855c35-8e7b-4089-848f-e325b779dc51","Type":"ContainerDied","Data":"30664f2ee4cbc69d8796026dad9f8016272855565c716accc4bf2a9ed9108043"} Feb 25 15:56:35 crc kubenswrapper[4937]: I0225 15:56:35.275067 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fswxh" 
event={"ID":"d1d5578b-cf1d-4208-91b2-2019dff70a16","Type":"ContainerStarted","Data":"e868ecd8d2835606b82d27df2f013393390778f741c4be91cf483f1201657a7e"} Feb 25 15:56:35 crc kubenswrapper[4937]: I0225 15:56:35.306582 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fswxh" podStartSLOduration=3.035146106 podStartE2EDuration="7.306563458s" podCreationTimestamp="2026-02-25 15:56:28 +0000 UTC" firstStartedPulling="2026-02-25 15:56:30.191838784 +0000 UTC m=+641.205230674" lastFinishedPulling="2026-02-25 15:56:34.463256136 +0000 UTC m=+645.476648026" observedRunningTime="2026-02-25 15:56:35.303463489 +0000 UTC m=+646.316855419" watchObservedRunningTime="2026-02-25 15:56:35.306563458 +0000 UTC m=+646.319955348" Feb 25 15:56:36 crc kubenswrapper[4937]: I0225 15:56:36.286432 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmbgl" event={"ID":"e9f770fc-fde6-4340-8c2a-a33e619cb169","Type":"ContainerStarted","Data":"7f39be550bf2ea50e27e61e024e732b6a9902f5a1cdbeae7babb598137832834"} Feb 25 15:56:37 crc kubenswrapper[4937]: I0225 15:56:37.299421 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6dqwg" event={"ID":"18855c35-8e7b-4089-848f-e325b779dc51","Type":"ContainerStarted","Data":"16023d3b41a44c5c86f8ad6cea3fa2b30ae73ae337d8fe56d94a279a915e8793"} Feb 25 15:56:37 crc kubenswrapper[4937]: I0225 15:56:37.317534 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hmbgl" podStartSLOduration=2.46914471 podStartE2EDuration="6.317509593s" podCreationTimestamp="2026-02-25 15:56:31 +0000 UTC" firstStartedPulling="2026-02-25 15:56:32.23372283 +0000 UTC m=+643.247114720" lastFinishedPulling="2026-02-25 15:56:36.082087713 +0000 UTC m=+647.095479603" observedRunningTime="2026-02-25 15:56:37.316167389 +0000 UTC m=+648.329559279" watchObservedRunningTime="2026-02-25 15:56:37.317509593 +0000 UTC m=+648.330901483" Feb 25 15:56:37 crc kubenswrapper[4937]: I0225 15:56:37.339148 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6dqwg" podStartSLOduration=3.327515582 podStartE2EDuration="6.339125824s" podCreationTimestamp="2026-02-25 15:56:31 +0000 UTC" firstStartedPulling="2026-02-25 15:56:33.240237765 +0000 UTC m=+644.253629655" lastFinishedPulling="2026-02-25 15:56:36.251848007 +0000 UTC m=+647.265239897" observedRunningTime="2026-02-25 15:56:37.337127003 +0000 UTC m=+648.350518893" watchObservedRunningTime="2026-02-25 15:56:37.339125824 +0000 UTC m=+648.352517714" Feb 25 15:56:39 crc kubenswrapper[4937]: I0225 15:56:39.065474 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:39 crc kubenswrapper[4937]: I0225 15:56:39.065756 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:39 crc kubenswrapper[4937]: I0225 15:56:39.118235 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:39 crc kubenswrapper[4937]: I0225 15:56:39.275888 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:39 crc kubenswrapper[4937]: I0225 15:56:39.275946 4937 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:39 crc kubenswrapper[4937]: I0225 15:56:39.352156 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8n9q4" Feb 25 15:56:40 crc kubenswrapper[4937]: I0225 15:56:40.314727 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fswxh" podUID="d1d5578b-cf1d-4208-91b2-2019dff70a16" containerName="registry-server" probeResult="failure" output=< Feb 25 15:56:40 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 15:56:40 crc kubenswrapper[4937]: > Feb 25 15:56:41 crc kubenswrapper[4937]: I0225 15:56:41.469403 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:41 crc kubenswrapper[4937]: I0225 15:56:41.469501 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:41 crc kubenswrapper[4937]: I0225 15:56:41.501416 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 15:56:41 crc kubenswrapper[4937]: I0225 15:56:41.501505 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 15:56:41 crc kubenswrapper[4937]: I0225 15:56:41.511926 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:41 crc kubenswrapper[4937]: I0225 15:56:41.686974 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:41 crc kubenswrapper[4937]: I0225 15:56:41.687042 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:41 crc kubenswrapper[4937]: I0225 15:56:41.727030 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:42 crc kubenswrapper[4937]: I0225 15:56:42.370107 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 15:56:42 crc kubenswrapper[4937]: I0225 15:56:42.418240 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6dqwg" Feb 25 15:56:49 crc kubenswrapper[4937]: I0225 15:56:49.328283 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:49 crc kubenswrapper[4937]: I0225 15:56:49.379063 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fswxh" Feb 25 15:56:53 crc kubenswrapper[4937]: I0225 15:56:53.089011 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" 
podUID="6a145826-4023-4211-aa90-aedba31d17c1" containerName="oauth-openshift" containerID="cri-o://84cc9893d6aba5a2fca1959fe45c368ec777f3735bf80bb1ae8b72cc6a23f627" gracePeriod=15 Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.004063 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.043484 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-64679bdf5-hpq7q"] Feb 25 15:56:54 crc kubenswrapper[4937]: E0225 15:56:54.043674 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a145826-4023-4211-aa90-aedba31d17c1" containerName="oauth-openshift" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.043685 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a145826-4023-4211-aa90-aedba31d17c1" containerName="oauth-openshift" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.043770 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a145826-4023-4211-aa90-aedba31d17c1" containerName="oauth-openshift" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.044098 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.057907 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-64679bdf5-hpq7q"] Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.180637 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-ocp-branding-template\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.180685 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-router-certs\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.180713 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-provider-selection\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.180749 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfzkl\" (UniqueName: \"kubernetes.io/projected/6a145826-4023-4211-aa90-aedba31d17c1-kube-api-access-rfzkl\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.180782 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-audit-policies\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.180832 4937 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-serving-cert\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.180864 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-login\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.180888 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-cliconfig\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.180923 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-session\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.180963 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-idp-0-file-data\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.180987 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-error\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181017 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-service-ca\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181070 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-trusted-ca-bundle\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181099 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a145826-4023-4211-aa90-aedba31d17c1-audit-dir\") pod \"6a145826-4023-4211-aa90-aedba31d17c1\" (UID: \"6a145826-4023-4211-aa90-aedba31d17c1\") " Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181284 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-router-certs\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181319 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181369 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-serving-cert\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181395 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-user-template-login\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181421 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-session\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181450 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl8p9\" (UniqueName: \"kubernetes.io/projected/b5437d57-f0bd-4df7-b310-c922e4df7555-kube-api-access-kl8p9\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181476 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5437d57-f0bd-4df7-b310-c922e4df7555-audit-policies\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181515 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5437d57-f0bd-4df7-b310-c922e4df7555-audit-dir\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181538 4937 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-cliconfig\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181562 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181585 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-user-template-error\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181612 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181636 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.181661 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-service-ca\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.183008 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a145826-4023-4211-aa90-aedba31d17c1-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.183169 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.183179 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.183209 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.183572 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.187238 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.188122 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.192124 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a145826-4023-4211-aa90-aedba31d17c1-kube-api-access-rfzkl" (OuterVolumeSpecName: "kube-api-access-rfzkl") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). InnerVolumeSpecName "kube-api-access-rfzkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.192183 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.195107 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.195421 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.196019 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.196393 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.196768 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6a145826-4023-4211-aa90-aedba31d17c1" (UID: "6a145826-4023-4211-aa90-aedba31d17c1"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282601 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-service-ca\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282661 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-router-certs\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282681 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282717 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-serving-cert\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282737 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-user-template-login\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282755 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-session\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282777 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl8p9\" (UniqueName: \"kubernetes.io/projected/b5437d57-f0bd-4df7-b310-c922e4df7555-kube-api-access-kl8p9\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282798 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5437d57-f0bd-4df7-b310-c922e4df7555-audit-policies\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc 
kubenswrapper[4937]: I0225 15:56:54.282837 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-cliconfig\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282852 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282868 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5437d57-f0bd-4df7-b310-c922e4df7555-audit-dir\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282884 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-user-template-error\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282899 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282917 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282953 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282965 4937 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a145826-4023-4211-aa90-aedba31d17c1-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282976 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc 
kubenswrapper[4937]: I0225 15:56:54.282985 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.282994 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.283003 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfzkl\" (UniqueName: \"kubernetes.io/projected/6a145826-4023-4211-aa90-aedba31d17c1-kube-api-access-rfzkl\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.283012 4937 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.283020 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.283030 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.283039 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.283047 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.283055 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.283064 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.283073 4937 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a145826-4023-4211-aa90-aedba31d17c1-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.283069 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5437d57-f0bd-4df7-b310-c922e4df7555-audit-dir\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: 
\"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.283658 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-service-ca\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.283966 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5437d57-f0bd-4df7-b310-c922e4df7555-audit-policies\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.284129 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.284151 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-cliconfig\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.286270 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-user-template-login\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.287375 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-session\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.287511 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-serving-cert\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.287869 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-user-template-error\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 
crc kubenswrapper[4937]: I0225 15:56:54.287875 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.288049 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-router-certs\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.289892 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.290646 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5437d57-f0bd-4df7-b310-c922e4df7555-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.300741 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl8p9\" (UniqueName: \"kubernetes.io/projected/b5437d57-f0bd-4df7-b310-c922e4df7555-kube-api-access-kl8p9\") pod \"oauth-openshift-64679bdf5-hpq7q\" (UID: \"b5437d57-f0bd-4df7-b310-c922e4df7555\") " pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.355627 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.411330 4937 generic.go:334] "Generic (PLEG): container finished" podID="6a145826-4023-4211-aa90-aedba31d17c1" containerID="84cc9893d6aba5a2fca1959fe45c368ec777f3735bf80bb1ae8b72cc6a23f627" exitCode=0 Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.411375 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" event={"ID":"6a145826-4023-4211-aa90-aedba31d17c1","Type":"ContainerDied","Data":"84cc9893d6aba5a2fca1959fe45c368ec777f3735bf80bb1ae8b72cc6a23f627"} Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.411450 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" event={"ID":"6a145826-4023-4211-aa90-aedba31d17c1","Type":"ContainerDied","Data":"f4e58d6f14105b145f9147f255e2896f4d1d2e3ec67b44d86027c62fde8b8492"} Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.411474 4937 scope.go:117] "RemoveContainer" containerID="84cc9893d6aba5a2fca1959fe45c368ec777f3735bf80bb1ae8b72cc6a23f627" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.411872 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6r6g7" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.440942 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6r6g7"] Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.447608 4937 scope.go:117] "RemoveContainer" containerID="84cc9893d6aba5a2fca1959fe45c368ec777f3735bf80bb1ae8b72cc6a23f627" Feb 25 15:56:54 crc kubenswrapper[4937]: E0225 15:56:54.448637 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84cc9893d6aba5a2fca1959fe45c368ec777f3735bf80bb1ae8b72cc6a23f627\": container with ID starting with 84cc9893d6aba5a2fca1959fe45c368ec777f3735bf80bb1ae8b72cc6a23f627 not found: ID does not exist" containerID="84cc9893d6aba5a2fca1959fe45c368ec777f3735bf80bb1ae8b72cc6a23f627" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.448715 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84cc9893d6aba5a2fca1959fe45c368ec777f3735bf80bb1ae8b72cc6a23f627"} err="failed to get container status \"84cc9893d6aba5a2fca1959fe45c368ec777f3735bf80bb1ae8b72cc6a23f627\": rpc error: code = NotFound desc = could not find container \"84cc9893d6aba5a2fca1959fe45c368ec777f3735bf80bb1ae8b72cc6a23f627\": container with ID starting with 84cc9893d6aba5a2fca1959fe45c368ec777f3735bf80bb1ae8b72cc6a23f627 not found: ID does not exist" Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.451150 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6r6g7"] Feb 25 15:56:54 crc kubenswrapper[4937]: I0225 15:56:54.825090 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-64679bdf5-hpq7q"] Feb 25 15:56:54 crc kubenswrapper[4937]: W0225 15:56:54.829021 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5437d57_f0bd_4df7_b310_c922e4df7555.slice/crio-26828eda84d3847046cd2f63eb18bef058d9c3952e1ede09973589c00d1582f5 WatchSource:0}: Error finding container 
26828eda84d3847046cd2f63eb18bef058d9c3952e1ede09973589c00d1582f5: Status 404 returned error can't find the container with id 26828eda84d3847046cd2f63eb18bef058d9c3952e1ede09973589c00d1582f5 Feb 25 15:56:55 crc kubenswrapper[4937]: I0225 15:56:55.373735 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a145826-4023-4211-aa90-aedba31d17c1" path="/var/lib/kubelet/pods/6a145826-4023-4211-aa90-aedba31d17c1/volumes" Feb 25 15:56:55 crc kubenswrapper[4937]: I0225 15:56:55.416524 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" event={"ID":"b5437d57-f0bd-4df7-b310-c922e4df7555","Type":"ContainerStarted","Data":"0362dd9119f220c3fd9934af3a094d0c47ead9855be7204261c8200aba23dea2"} Feb 25 15:56:55 crc kubenswrapper[4937]: I0225 15:56:55.416567 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" event={"ID":"b5437d57-f0bd-4df7-b310-c922e4df7555","Type":"ContainerStarted","Data":"26828eda84d3847046cd2f63eb18bef058d9c3952e1ede09973589c00d1582f5"} Feb 25 15:56:55 crc kubenswrapper[4937]: I0225 15:56:55.417140 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:56:55 crc kubenswrapper[4937]: I0225 15:56:55.434040 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" podStartSLOduration=27.434025557 podStartE2EDuration="27.434025557s" podCreationTimestamp="2026-02-25 15:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 15:56:55.433946235 +0000 UTC m=+666.447338115" watchObservedRunningTime="2026-02-25 15:56:55.434025557 +0000 UTC m=+666.447417447" Feb 25 15:56:55 crc kubenswrapper[4937]: I0225 15:56:55.594628 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-64679bdf5-hpq7q" Feb 25 15:57:11 crc kubenswrapper[4937]: I0225 15:57:11.495257 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 15:57:11 crc kubenswrapper[4937]: I0225 15:57:11.495922 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 15:57:21 crc kubenswrapper[4937]: I0225 15:57:21.412986 4937 scope.go:117] "RemoveContainer" containerID="7d4a5e1a11786b57a50a8e25125cd85bfbc33b6976f9e0ae9c6b9672b97193b3" Feb 25 15:57:21 crc kubenswrapper[4937]: I0225 15:57:21.451956 4937 scope.go:117] "RemoveContainer" containerID="659ed1300e90273684c34bf480b419cb492c3a5b6b5118aaba0086df07266833" Feb 25 15:57:41 crc kubenswrapper[4937]: I0225 15:57:41.494808 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 
15:57:41 crc kubenswrapper[4937]: I0225 15:57:41.495503 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 15:57:41 crc kubenswrapper[4937]: I0225 15:57:41.495562 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 15:57:41 crc kubenswrapper[4937]: I0225 15:57:41.496157 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81075b52eae9291c3100b4940aeb92ab7d1af5e10120db2c4bfe3a84072fc970"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 15:57:41 crc kubenswrapper[4937]: I0225 15:57:41.496207 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://81075b52eae9291c3100b4940aeb92ab7d1af5e10120db2c4bfe3a84072fc970" gracePeriod=600 Feb 25 15:57:41 crc kubenswrapper[4937]: I0225 15:57:41.722043 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="81075b52eae9291c3100b4940aeb92ab7d1af5e10120db2c4bfe3a84072fc970" exitCode=0 Feb 25 15:57:41 crc kubenswrapper[4937]: I0225 15:57:41.722148 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"81075b52eae9291c3100b4940aeb92ab7d1af5e10120db2c4bfe3a84072fc970"} Feb 25 15:57:41 crc kubenswrapper[4937]: I0225 15:57:41.722694 4937 scope.go:117] "RemoveContainer" containerID="c9825ea4151fdf2ed4aea12d0ed9b0d4287c1ad0da21c88a4d5b343d65fcffef" Feb 25 15:57:42 crc kubenswrapper[4937]: I0225 15:57:42.733391 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"95cb15cff9f98a839dbce0c2049d37638146e1c360cc03bdc2f2c9958a469258"} Feb 25 15:58:00 crc kubenswrapper[4937]: I0225 15:58:00.140673 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533918-pv26w"] Feb 25 15:58:00 crc kubenswrapper[4937]: I0225 15:58:00.141992 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533918-pv26w" Feb 25 15:58:00 crc kubenswrapper[4937]: I0225 15:58:00.144676 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 15:58:00 crc kubenswrapper[4937]: I0225 15:58:00.145054 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 15:58:00 crc kubenswrapper[4937]: I0225 15:58:00.145926 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 15:58:00 crc kubenswrapper[4937]: I0225 15:58:00.157659 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533918-pv26w"] Feb 25 15:58:00 crc kubenswrapper[4937]: I0225 15:58:00.196569 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q6h7\" (UniqueName: \"kubernetes.io/projected/5fef7f95-32ec-4dd5-b8a0-0868fef33d2f-kube-api-access-9q6h7\") pod \"auto-csr-approver-29533918-pv26w\" (UID: \"5fef7f95-32ec-4dd5-b8a0-0868fef33d2f\") " pod="openshift-infra/auto-csr-approver-29533918-pv26w" Feb 25 15:58:00 crc kubenswrapper[4937]: I0225 15:58:00.297653 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q6h7\" (UniqueName: \"kubernetes.io/projected/5fef7f95-32ec-4dd5-b8a0-0868fef33d2f-kube-api-access-9q6h7\") pod \"auto-csr-approver-29533918-pv26w\" (UID: \"5fef7f95-32ec-4dd5-b8a0-0868fef33d2f\") " pod="openshift-infra/auto-csr-approver-29533918-pv26w" Feb 25 15:58:00 crc kubenswrapper[4937]: I0225 15:58:00.324637 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q6h7\" (UniqueName: \"kubernetes.io/projected/5fef7f95-32ec-4dd5-b8a0-0868fef33d2f-kube-api-access-9q6h7\") pod \"auto-csr-approver-29533918-pv26w\" (UID: \"5fef7f95-32ec-4dd5-b8a0-0868fef33d2f\") " pod="openshift-infra/auto-csr-approver-29533918-pv26w" Feb 25 15:58:00 crc kubenswrapper[4937]: I0225 15:58:00.493128 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533918-pv26w" Feb 25 15:58:00 crc kubenswrapper[4937]: I0225 15:58:00.985088 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533918-pv26w"] Feb 25 15:58:01 crc kubenswrapper[4937]: I0225 15:58:01.855895 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533918-pv26w" event={"ID":"5fef7f95-32ec-4dd5-b8a0-0868fef33d2f","Type":"ContainerStarted","Data":"daa50919093cf25e95f257d076c80b9fe31bc238a2f9a5faa14f6fc740d69341"} Feb 25 15:58:02 crc kubenswrapper[4937]: I0225 15:58:02.865770 4937 generic.go:334] "Generic (PLEG): container finished" podID="5fef7f95-32ec-4dd5-b8a0-0868fef33d2f" containerID="d49c93d817419b11494637c1e12156aa071d1ed713591de7ead4976e7ad12c93" exitCode=0 Feb 25 15:58:02 crc kubenswrapper[4937]: I0225 15:58:02.865854 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533918-pv26w" event={"ID":"5fef7f95-32ec-4dd5-b8a0-0868fef33d2f","Type":"ContainerDied","Data":"d49c93d817419b11494637c1e12156aa071d1ed713591de7ead4976e7ad12c93"} Feb 25 15:58:04 crc kubenswrapper[4937]: I0225 15:58:04.173527 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533918-pv26w" Feb 25 15:58:04 crc kubenswrapper[4937]: I0225 15:58:04.331700 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q6h7\" (UniqueName: \"kubernetes.io/projected/5fef7f95-32ec-4dd5-b8a0-0868fef33d2f-kube-api-access-9q6h7\") pod \"5fef7f95-32ec-4dd5-b8a0-0868fef33d2f\" (UID: \"5fef7f95-32ec-4dd5-b8a0-0868fef33d2f\") " Feb 25 15:58:04 crc kubenswrapper[4937]: I0225 15:58:04.339979 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fef7f95-32ec-4dd5-b8a0-0868fef33d2f-kube-api-access-9q6h7" (OuterVolumeSpecName: "kube-api-access-9q6h7") pod "5fef7f95-32ec-4dd5-b8a0-0868fef33d2f" (UID: "5fef7f95-32ec-4dd5-b8a0-0868fef33d2f"). InnerVolumeSpecName "kube-api-access-9q6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 15:58:04 crc kubenswrapper[4937]: I0225 15:58:04.433636 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q6h7\" (UniqueName: \"kubernetes.io/projected/5fef7f95-32ec-4dd5-b8a0-0868fef33d2f-kube-api-access-9q6h7\") on node \"crc\" DevicePath \"\"" Feb 25 15:58:04 crc kubenswrapper[4937]: I0225 15:58:04.882337 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533918-pv26w" event={"ID":"5fef7f95-32ec-4dd5-b8a0-0868fef33d2f","Type":"ContainerDied","Data":"daa50919093cf25e95f257d076c80b9fe31bc238a2f9a5faa14f6fc740d69341"} Feb 25 15:58:04 crc kubenswrapper[4937]: I0225 15:58:04.882381 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daa50919093cf25e95f257d076c80b9fe31bc238a2f9a5faa14f6fc740d69341" Feb 25 15:58:04 crc kubenswrapper[4937]: I0225 15:58:04.882912 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533918-pv26w" Feb 25 15:58:05 crc kubenswrapper[4937]: I0225 15:58:05.238510 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533912-7rslj"] Feb 25 15:58:05 crc kubenswrapper[4937]: I0225 15:58:05.243265 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533912-7rslj"] Feb 25 15:58:05 crc kubenswrapper[4937]: I0225 15:58:05.381618 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cf00929-ff8b-42c5-96f8-6f02e52372ae" path="/var/lib/kubelet/pods/6cf00929-ff8b-42c5-96f8-6f02e52372ae/volumes" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.139927 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533920-8jlbb"] Feb 25 16:00:00 crc kubenswrapper[4937]: E0225 16:00:00.142838 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fef7f95-32ec-4dd5-b8a0-0868fef33d2f" containerName="oc" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.142974 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fef7f95-32ec-4dd5-b8a0-0868fef33d2f" containerName="oc" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.143236 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fef7f95-32ec-4dd5-b8a0-0868fef33d2f" containerName="oc" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.143946 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533920-8jlbb" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.146788 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw"] Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.146857 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.147065 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.147240 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.147466 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.154984 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.154985 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.169627 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw"] Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.177782 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533920-8jlbb"] Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.300852 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-config-volume\") pod \"collect-profiles-29533920-ksbgw\" (UID: \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.300942 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8jk5\" (UniqueName: \"kubernetes.io/projected/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-kube-api-access-d8jk5\") pod \"collect-profiles-29533920-ksbgw\" (UID: \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.301081 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-secret-volume\") pod \"collect-profiles-29533920-ksbgw\" (UID: \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.301145 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qrfx\" (UniqueName: \"kubernetes.io/projected/470f17dd-ee35-4eb9-b7d2-2815acdc1b9c-kube-api-access-2qrfx\") pod \"auto-csr-approver-29533920-8jlbb\" (UID: \"470f17dd-ee35-4eb9-b7d2-2815acdc1b9c\") " pod="openshift-infra/auto-csr-approver-29533920-8jlbb" Feb 
25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.402376 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qrfx\" (UniqueName: \"kubernetes.io/projected/470f17dd-ee35-4eb9-b7d2-2815acdc1b9c-kube-api-access-2qrfx\") pod \"auto-csr-approver-29533920-8jlbb\" (UID: \"470f17dd-ee35-4eb9-b7d2-2815acdc1b9c\") " pod="openshift-infra/auto-csr-approver-29533920-8jlbb" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.402476 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-config-volume\") pod \"collect-profiles-29533920-ksbgw\" (UID: \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.402520 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8jk5\" (UniqueName: \"kubernetes.io/projected/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-kube-api-access-d8jk5\") pod \"collect-profiles-29533920-ksbgw\" (UID: \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.402552 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-secret-volume\") pod \"collect-profiles-29533920-ksbgw\" (UID: \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.403436 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-config-volume\") pod \"collect-profiles-29533920-ksbgw\" (UID: \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.414186 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-secret-volume\") pod \"collect-profiles-29533920-ksbgw\" (UID: \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.421199 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8jk5\" (UniqueName: \"kubernetes.io/projected/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-kube-api-access-d8jk5\") pod \"collect-profiles-29533920-ksbgw\" (UID: \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.435399 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qrfx\" (UniqueName: \"kubernetes.io/projected/470f17dd-ee35-4eb9-b7d2-2815acdc1b9c-kube-api-access-2qrfx\") pod \"auto-csr-approver-29533920-8jlbb\" (UID: \"470f17dd-ee35-4eb9-b7d2-2815acdc1b9c\") " pod="openshift-infra/auto-csr-approver-29533920-8jlbb" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.479793 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533920-8jlbb" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.491647 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.764979 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw"] Feb 25 16:00:00 crc kubenswrapper[4937]: W0225 16:00:00.771648 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250e9ce0_aa13_4ba3_ab3f_b6fd566d7f9f.slice/crio-8d1f459925276a65b50f0504c20e5ea5366bbc1214a0b3e1aebe4aaf3ee55a27 WatchSource:0}: Error finding container 8d1f459925276a65b50f0504c20e5ea5366bbc1214a0b3e1aebe4aaf3ee55a27: Status 404 returned error can't find the container with id 8d1f459925276a65b50f0504c20e5ea5366bbc1214a0b3e1aebe4aaf3ee55a27 Feb 25 16:00:00 crc kubenswrapper[4937]: I0225 16:00:00.950295 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533920-8jlbb"] Feb 25 16:00:00 crc kubenswrapper[4937]: W0225 16:00:00.954342 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod470f17dd_ee35_4eb9_b7d2_2815acdc1b9c.slice/crio-07e055a04ec48c2122ad8d215be42fac735714ac509729aa96a4dd8e74269ff0 WatchSource:0}: Error finding container 07e055a04ec48c2122ad8d215be42fac735714ac509729aa96a4dd8e74269ff0: Status 404 returned error can't find the container with id 07e055a04ec48c2122ad8d215be42fac735714ac509729aa96a4dd8e74269ff0 Feb 25 16:00:01 crc kubenswrapper[4937]: I0225 16:00:01.531511 4937 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 25 16:00:01 crc kubenswrapper[4937]: I0225 16:00:01.683989 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533920-8jlbb" event={"ID":"470f17dd-ee35-4eb9-b7d2-2815acdc1b9c","Type":"ContainerStarted","Data":"07e055a04ec48c2122ad8d215be42fac735714ac509729aa96a4dd8e74269ff0"} Feb 25 16:00:01 crc kubenswrapper[4937]: I0225 16:00:01.686018 4937 generic.go:334] "Generic (PLEG): container finished" podID="250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f" containerID="98afa0be2a7238676648049bfd8bad309c32dec99502b3a71d7d6477a73bc0ea" exitCode=0 Feb 25 16:00:01 crc kubenswrapper[4937]: I0225 16:00:01.686074 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" event={"ID":"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f","Type":"ContainerDied","Data":"98afa0be2a7238676648049bfd8bad309c32dec99502b3a71d7d6477a73bc0ea"} Feb 25 16:00:01 crc kubenswrapper[4937]: I0225 16:00:01.686111 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" event={"ID":"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f","Type":"ContainerStarted","Data":"8d1f459925276a65b50f0504c20e5ea5366bbc1214a0b3e1aebe4aaf3ee55a27"} Feb 25 16:00:03 crc kubenswrapper[4937]: I0225 16:00:03.002860 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" Feb 25 16:00:03 crc kubenswrapper[4937]: I0225 16:00:03.142097 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-config-volume\") pod \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\" (UID: \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\") " Feb 25 16:00:03 crc kubenswrapper[4937]: I0225 16:00:03.142208 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8jk5\" (UniqueName: \"kubernetes.io/projected/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-kube-api-access-d8jk5\") pod \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\" (UID: \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\") " Feb 25 16:00:03 crc kubenswrapper[4937]: I0225 16:00:03.142292 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-secret-volume\") pod \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\" (UID: \"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f\") " Feb 25 16:00:03 crc kubenswrapper[4937]: I0225 16:00:03.143256 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-config-volume" (OuterVolumeSpecName: "config-volume") pod "250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f" (UID: "250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:00:03 crc kubenswrapper[4937]: I0225 16:00:03.150945 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-kube-api-access-d8jk5" (OuterVolumeSpecName: "kube-api-access-d8jk5") pod "250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f" (UID: "250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f"). InnerVolumeSpecName "kube-api-access-d8jk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:00:03 crc kubenswrapper[4937]: I0225 16:00:03.152752 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f" (UID: "250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:00:03 crc kubenswrapper[4937]: I0225 16:00:03.244427 4937 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 16:00:03 crc kubenswrapper[4937]: I0225 16:00:03.244961 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8jk5\" (UniqueName: \"kubernetes.io/projected/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-kube-api-access-d8jk5\") on node \"crc\" DevicePath \"\"" Feb 25 16:00:03 crc kubenswrapper[4937]: I0225 16:00:03.244991 4937 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 16:00:03 crc kubenswrapper[4937]: I0225 16:00:03.701139 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" event={"ID":"250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f","Type":"ContainerDied","Data":"8d1f459925276a65b50f0504c20e5ea5366bbc1214a0b3e1aebe4aaf3ee55a27"} Feb 25 16:00:03 crc kubenswrapper[4937]: I0225 16:00:03.701723 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d1f459925276a65b50f0504c20e5ea5366bbc1214a0b3e1aebe4aaf3ee55a27" Feb 25 16:00:03 crc kubenswrapper[4937]: I0225 16:00:03.701568 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw" Feb 25 16:00:11 crc kubenswrapper[4937]: I0225 16:00:11.494584 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:00:11 crc kubenswrapper[4937]: I0225 16:00:11.495190 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:00:17 crc kubenswrapper[4937]: I0225 16:00:17.805793 4937 generic.go:334] "Generic (PLEG): container finished" podID="470f17dd-ee35-4eb9-b7d2-2815acdc1b9c" containerID="92e63649e314d4809a104909b649a75fea923aa2e67fb276105f8f47f22b467d" exitCode=0 Feb 25 16:00:17 crc kubenswrapper[4937]: I0225 16:00:17.805954 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533920-8jlbb" event={"ID":"470f17dd-ee35-4eb9-b7d2-2815acdc1b9c","Type":"ContainerDied","Data":"92e63649e314d4809a104909b649a75fea923aa2e67fb276105f8f47f22b467d"} Feb 25 16:00:19 crc kubenswrapper[4937]: I0225 16:00:19.100886 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533920-8jlbb" Feb 25 16:00:19 crc kubenswrapper[4937]: I0225 16:00:19.265524 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qrfx\" (UniqueName: \"kubernetes.io/projected/470f17dd-ee35-4eb9-b7d2-2815acdc1b9c-kube-api-access-2qrfx\") pod \"470f17dd-ee35-4eb9-b7d2-2815acdc1b9c\" (UID: \"470f17dd-ee35-4eb9-b7d2-2815acdc1b9c\") " Feb 25 16:00:19 crc kubenswrapper[4937]: I0225 16:00:19.272277 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/470f17dd-ee35-4eb9-b7d2-2815acdc1b9c-kube-api-access-2qrfx" (OuterVolumeSpecName: "kube-api-access-2qrfx") pod "470f17dd-ee35-4eb9-b7d2-2815acdc1b9c" (UID: "470f17dd-ee35-4eb9-b7d2-2815acdc1b9c"). InnerVolumeSpecName "kube-api-access-2qrfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:00:19 crc kubenswrapper[4937]: I0225 16:00:19.367144 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qrfx\" (UniqueName: \"kubernetes.io/projected/470f17dd-ee35-4eb9-b7d2-2815acdc1b9c-kube-api-access-2qrfx\") on node \"crc\" DevicePath \"\"" Feb 25 16:00:19 crc kubenswrapper[4937]: I0225 16:00:19.835851 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533920-8jlbb" event={"ID":"470f17dd-ee35-4eb9-b7d2-2815acdc1b9c","Type":"ContainerDied","Data":"07e055a04ec48c2122ad8d215be42fac735714ac509729aa96a4dd8e74269ff0"} Feb 25 16:00:19 crc kubenswrapper[4937]: I0225 16:00:19.835903 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07e055a04ec48c2122ad8d215be42fac735714ac509729aa96a4dd8e74269ff0" Feb 25 16:00:19 crc kubenswrapper[4937]: I0225 16:00:19.835906 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533920-8jlbb" Feb 25 16:00:20 crc kubenswrapper[4937]: I0225 16:00:20.188012 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533914-nfxdw"] Feb 25 16:00:20 crc kubenswrapper[4937]: I0225 16:00:20.195214 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533914-nfxdw"] Feb 25 16:00:21 crc kubenswrapper[4937]: I0225 16:00:21.378601 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e90b54e9-9441-4afc-a28d-6febea150a04" path="/var/lib/kubelet/pods/e90b54e9-9441-4afc-a28d-6febea150a04/volumes" Feb 25 16:00:41 crc kubenswrapper[4937]: I0225 16:00:41.494363 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:00:41 crc kubenswrapper[4937]: I0225 16:00:41.495064 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.489004 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rgfct"] Feb 25 16:00:54 crc kubenswrapper[4937]: E0225 16:00:54.489884 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f" containerName="collect-profiles" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.489906 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f" containerName="collect-profiles" Feb 25 16:00:54 crc kubenswrapper[4937]: E0225 16:00:54.489935 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470f17dd-ee35-4eb9-b7d2-2815acdc1b9c" containerName="oc" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.489948 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="470f17dd-ee35-4eb9-b7d2-2815acdc1b9c" containerName="oc" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.490104 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f" containerName="collect-profiles" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.490174 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="470f17dd-ee35-4eb9-b7d2-2815acdc1b9c" containerName="oc" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.490702 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.510101 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rgfct"] Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.691785 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5bf5a58-5557-4250-858e-a36df6f9b825-registry-certificates\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.691863 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5bf5a58-5557-4250-858e-a36df6f9b825-trusted-ca\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.691892 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wzbd\" (UniqueName: \"kubernetes.io/projected/c5bf5a58-5557-4250-858e-a36df6f9b825-kube-api-access-9wzbd\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.691943 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.691977 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5bf5a58-5557-4250-858e-a36df6f9b825-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.692007 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5bf5a58-5557-4250-858e-a36df6f9b825-registry-tls\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.692038 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5bf5a58-5557-4250-858e-a36df6f9b825-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.692111 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/c5bf5a58-5557-4250-858e-a36df6f9b825-bound-sa-token\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.726005 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.793410 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5bf5a58-5557-4250-858e-a36df6f9b825-bound-sa-token\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.793546 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5bf5a58-5557-4250-858e-a36df6f9b825-registry-certificates\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.793616 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5bf5a58-5557-4250-858e-a36df6f9b825-trusted-ca\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.793652 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wzbd\" (UniqueName: \"kubernetes.io/projected/c5bf5a58-5557-4250-858e-a36df6f9b825-kube-api-access-9wzbd\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.793726 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5bf5a58-5557-4250-858e-a36df6f9b825-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.793772 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5bf5a58-5557-4250-858e-a36df6f9b825-registry-tls\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.793807 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5bf5a58-5557-4250-858e-a36df6f9b825-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.794671 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5bf5a58-5557-4250-858e-a36df6f9b825-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.795011 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5bf5a58-5557-4250-858e-a36df6f9b825-trusted-ca\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.795199 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5bf5a58-5557-4250-858e-a36df6f9b825-registry-certificates\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.803587 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5bf5a58-5557-4250-858e-a36df6f9b825-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.806830 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5bf5a58-5557-4250-858e-a36df6f9b825-registry-tls\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.817132 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wzbd\" (UniqueName: \"kubernetes.io/projected/c5bf5a58-5557-4250-858e-a36df6f9b825-kube-api-access-9wzbd\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:54 crc kubenswrapper[4937]: I0225 16:00:54.825649 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5bf5a58-5557-4250-858e-a36df6f9b825-bound-sa-token\") pod \"image-registry-66df7c8f76-rgfct\" (UID: \"c5bf5a58-5557-4250-858e-a36df6f9b825\") " pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:55 crc kubenswrapper[4937]: I0225 16:00:55.108901 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:55 crc kubenswrapper[4937]: I0225 16:00:55.357118 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rgfct"] Feb 25 16:00:56 crc kubenswrapper[4937]: I0225 16:00:56.084386 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" event={"ID":"c5bf5a58-5557-4250-858e-a36df6f9b825","Type":"ContainerStarted","Data":"5a768e647fc93c4e86307feebc56da1c26d085a1cad49cab2d5c6400dc5c1e29"} Feb 25 16:00:56 crc kubenswrapper[4937]: I0225 16:00:56.084502 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" event={"ID":"c5bf5a58-5557-4250-858e-a36df6f9b825","Type":"ContainerStarted","Data":"2e5e6e9cd43d0f3449d804cf22551fa519e4419a918d3dde34827da402a54fc5"} Feb 25 16:00:56 crc kubenswrapper[4937]: I0225 16:00:56.084602 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:00:56 crc kubenswrapper[4937]: I0225 16:00:56.108363 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" podStartSLOduration=2.108327328 podStartE2EDuration="2.108327328s" podCreationTimestamp="2026-02-25 16:00:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:00:56.10722307 +0000 UTC m=+907.120614970" watchObservedRunningTime="2026-02-25 16:00:56.108327328 +0000 UTC m=+907.121719258" Feb 25 16:01:11 crc kubenswrapper[4937]: I0225 16:01:11.495031 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:01:11 crc kubenswrapper[4937]: I0225 16:01:11.495744 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:01:11 crc kubenswrapper[4937]: I0225 16:01:11.495807 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 16:01:11 crc kubenswrapper[4937]: I0225 16:01:11.496612 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"95cb15cff9f98a839dbce0c2049d37638146e1c360cc03bdc2f2c9958a469258"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 16:01:11 crc kubenswrapper[4937]: I0225 16:01:11.496696 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://95cb15cff9f98a839dbce0c2049d37638146e1c360cc03bdc2f2c9958a469258" gracePeriod=600 Feb 25 16:01:12 crc kubenswrapper[4937]: I0225 
16:01:12.203790 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="95cb15cff9f98a839dbce0c2049d37638146e1c360cc03bdc2f2c9958a469258" exitCode=0 Feb 25 16:01:12 crc kubenswrapper[4937]: I0225 16:01:12.203880 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"95cb15cff9f98a839dbce0c2049d37638146e1c360cc03bdc2f2c9958a469258"} Feb 25 16:01:12 crc kubenswrapper[4937]: I0225 16:01:12.203956 4937 scope.go:117] "RemoveContainer" containerID="81075b52eae9291c3100b4940aeb92ab7d1af5e10120db2c4bfe3a84072fc970" Feb 25 16:01:13 crc kubenswrapper[4937]: I0225 16:01:13.212339 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"a3de247f04ff3abf939866313cfef1da7c2e6ae7d14d3da3ecda7ba81bfc35f7"} Feb 25 16:01:15 crc kubenswrapper[4937]: I0225 16:01:15.114714 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-rgfct" Feb 25 16:01:15 crc kubenswrapper[4937]: I0225 16:01:15.197320 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pdnqk"] Feb 25 16:01:21 crc kubenswrapper[4937]: I0225 16:01:21.600855 4937 scope.go:117] "RemoveContainer" containerID="51af0abc0a79790cba8e41933bf07cb316171e1d1e922535f0b6f409bf32cad5" Feb 25 16:01:21 crc kubenswrapper[4937]: I0225 16:01:21.653824 4937 scope.go:117] "RemoveContainer" containerID="b1343e02223c1e86e17c58240086658a4b55da57ea15d8bda5b3e01b5c902978" Feb 25 16:01:40 crc kubenswrapper[4937]: I0225 16:01:40.254293 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" podUID="969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" containerName="registry" containerID="cri-o://72c109c539d2049284c53056ca9ffccf71c7af0d1ade28d88ac3eda8db25ba81" gracePeriod=30 Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.254612 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.418300 4937 generic.go:334] "Generic (PLEG): container finished" podID="969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" containerID="72c109c539d2049284c53056ca9ffccf71c7af0d1ade28d88ac3eda8db25ba81" exitCode=0 Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.418389 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" event={"ID":"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf","Type":"ContainerDied","Data":"72c109c539d2049284c53056ca9ffccf71c7af0d1ade28d88ac3eda8db25ba81"} Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.418445 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.418477 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pdnqk" event={"ID":"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf","Type":"ContainerDied","Data":"00edcc4b3894954d32ede2a937efe6772921ea5d60c8fc1214cf6b53e06c50d5"} Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.418533 4937 scope.go:117] "RemoveContainer" containerID="72c109c539d2049284c53056ca9ffccf71c7af0d1ade28d88ac3eda8db25ba81" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.425391 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-registry-certificates\") pod \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.425518 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-bound-sa-token\") pod \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.425648 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-trusted-ca\") pod \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.425729 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-ca-trust-extracted\") pod \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.425787 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-installation-pull-secrets\") pod \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.425835 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng78b\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-kube-api-access-ng78b\") pod \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.425931 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-registry-tls\") pod \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.426148 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\" (UID: \"969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf\") " Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.427563 
4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.428662 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.434478 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.434825 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.434928 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-kube-api-access-ng78b" (OuterVolumeSpecName: "kube-api-access-ng78b") pod "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf"). InnerVolumeSpecName "kube-api-access-ng78b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.435378 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.441354 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.450434 4937 scope.go:117] "RemoveContainer" containerID="72c109c539d2049284c53056ca9ffccf71c7af0d1ade28d88ac3eda8db25ba81" Feb 25 16:01:41 crc kubenswrapper[4937]: E0225 16:01:41.451093 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72c109c539d2049284c53056ca9ffccf71c7af0d1ade28d88ac3eda8db25ba81\": container with ID starting with 72c109c539d2049284c53056ca9ffccf71c7af0d1ade28d88ac3eda8db25ba81 not found: ID does not exist" containerID="72c109c539d2049284c53056ca9ffccf71c7af0d1ade28d88ac3eda8db25ba81" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.451148 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72c109c539d2049284c53056ca9ffccf71c7af0d1ade28d88ac3eda8db25ba81"} err="failed to get container status \"72c109c539d2049284c53056ca9ffccf71c7af0d1ade28d88ac3eda8db25ba81\": rpc error: code = NotFound desc = could not find container \"72c109c539d2049284c53056ca9ffccf71c7af0d1ade28d88ac3eda8db25ba81\": container with ID starting with 72c109c539d2049284c53056ca9ffccf71c7af0d1ade28d88ac3eda8db25ba81 not found: ID does not exist" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.457179 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" (UID: "969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.528013 4937 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.528062 4937 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.528081 4937 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.528096 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng78b\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-kube-api-access-ng78b\") on node \"crc\" DevicePath \"\"" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.528113 4937 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.528129 4937 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.528145 4937 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.772161 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pdnqk"] Feb 25 16:01:41 crc kubenswrapper[4937]: I0225 16:01:41.776343 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pdnqk"] Feb 25 16:01:43 crc kubenswrapper[4937]: I0225 16:01:43.379058 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" path="/var/lib/kubelet/pods/969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf/volumes" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.718424 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7"] Feb 25 16:01:56 crc kubenswrapper[4937]: E0225 16:01:56.719210 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" containerName="registry" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.719225 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" containerName="registry" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.719348 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="969c8ed5-2ea2-4966-9fd6-954ab7ef8ccf" containerName="registry" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.720279 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.721955 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.730259 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7"] Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.854350 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b58d852-ef69-4a94-8e1b-8892612ff7aa-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7\" (UID: \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.854457 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rrv7\" (UniqueName: \"kubernetes.io/projected/6b58d852-ef69-4a94-8e1b-8892612ff7aa-kube-api-access-2rrv7\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7\" (UID: \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.854585 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b58d852-ef69-4a94-8e1b-8892612ff7aa-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7\" (UID: \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.955598 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rrv7\" (UniqueName: \"kubernetes.io/projected/6b58d852-ef69-4a94-8e1b-8892612ff7aa-kube-api-access-2rrv7\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7\" (UID: \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.955657 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b58d852-ef69-4a94-8e1b-8892612ff7aa-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7\" (UID: \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.955702 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b58d852-ef69-4a94-8e1b-8892612ff7aa-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7\" (UID: \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.956140 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b58d852-ef69-4a94-8e1b-8892612ff7aa-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7\" (UID: \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.956291 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b58d852-ef69-4a94-8e1b-8892612ff7aa-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7\" (UID: \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" Feb 25 16:01:56 crc kubenswrapper[4937]: I0225 16:01:56.975424 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rrv7\" (UniqueName: \"kubernetes.io/projected/6b58d852-ef69-4a94-8e1b-8892612ff7aa-kube-api-access-2rrv7\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7\" (UID: \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" Feb 25 16:01:57 crc kubenswrapper[4937]: I0225 16:01:57.057184 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" Feb 25 16:01:57 crc kubenswrapper[4937]: I0225 16:01:57.309185 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7"] Feb 25 16:01:57 crc kubenswrapper[4937]: I0225 16:01:57.513053 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" event={"ID":"6b58d852-ef69-4a94-8e1b-8892612ff7aa","Type":"ContainerStarted","Data":"d30bf84d7e0cb60b0f3c671630dc7e80012b4916beb3e187a5237b75a216cdef"} Feb 25 16:01:57 crc kubenswrapper[4937]: I0225 16:01:57.513424 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" event={"ID":"6b58d852-ef69-4a94-8e1b-8892612ff7aa","Type":"ContainerStarted","Data":"bb55e4f7f0064ea4376caebaae78b4a82982488899da464bd3a1c1309caed73d"} Feb 25 16:01:58 crc kubenswrapper[4937]: I0225 16:01:58.521468 4937 generic.go:334] "Generic (PLEG): container finished" podID="6b58d852-ef69-4a94-8e1b-8892612ff7aa" containerID="d30bf84d7e0cb60b0f3c671630dc7e80012b4916beb3e187a5237b75a216cdef" exitCode=0 Feb 25 16:01:58 crc kubenswrapper[4937]: I0225 16:01:58.521574 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" event={"ID":"6b58d852-ef69-4a94-8e1b-8892612ff7aa","Type":"ContainerDied","Data":"d30bf84d7e0cb60b0f3c671630dc7e80012b4916beb3e187a5237b75a216cdef"} Feb 25 16:01:58 crc kubenswrapper[4937]: I0225 16:01:58.524034 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.069867 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q6pxs"] Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.071187 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.094330 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q6pxs"] Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.184055 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17373bf5-4311-4847-bfbf-3b346a214d8c-catalog-content\") pod \"redhat-operators-q6pxs\" (UID: \"17373bf5-4311-4847-bfbf-3b346a214d8c\") " pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.184114 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17373bf5-4311-4847-bfbf-3b346a214d8c-utilities\") pod \"redhat-operators-q6pxs\" (UID: \"17373bf5-4311-4847-bfbf-3b346a214d8c\") " pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.184141 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bbst\" (UniqueName: \"kubernetes.io/projected/17373bf5-4311-4847-bfbf-3b346a214d8c-kube-api-access-9bbst\") pod \"redhat-operators-q6pxs\" (UID: \"17373bf5-4311-4847-bfbf-3b346a214d8c\") " pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.285212 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17373bf5-4311-4847-bfbf-3b346a214d8c-catalog-content\") pod \"redhat-operators-q6pxs\" (UID: \"17373bf5-4311-4847-bfbf-3b346a214d8c\") " pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.285265 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17373bf5-4311-4847-bfbf-3b346a214d8c-utilities\") pod \"redhat-operators-q6pxs\" (UID: \"17373bf5-4311-4847-bfbf-3b346a214d8c\") " pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.285297 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bbst\" (UniqueName: \"kubernetes.io/projected/17373bf5-4311-4847-bfbf-3b346a214d8c-kube-api-access-9bbst\") pod \"redhat-operators-q6pxs\" (UID: \"17373bf5-4311-4847-bfbf-3b346a214d8c\") " pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.285857 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17373bf5-4311-4847-bfbf-3b346a214d8c-utilities\") pod \"redhat-operators-q6pxs\" (UID: \"17373bf5-4311-4847-bfbf-3b346a214d8c\") " pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.285958 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17373bf5-4311-4847-bfbf-3b346a214d8c-catalog-content\") pod \"redhat-operators-q6pxs\" (UID: \"17373bf5-4311-4847-bfbf-3b346a214d8c\") " pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.304361 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9bbst\" (UniqueName: \"kubernetes.io/projected/17373bf5-4311-4847-bfbf-3b346a214d8c-kube-api-access-9bbst\") pod \"redhat-operators-q6pxs\" (UID: \"17373bf5-4311-4847-bfbf-3b346a214d8c\") " pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.407284 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:01:59 crc kubenswrapper[4937]: I0225 16:01:59.614309 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q6pxs"] Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.133266 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533922-nkq2g"] Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.134140 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533922-nkq2g" Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.135662 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.136100 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.136211 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.143159 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533922-nkq2g"] Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.296892 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jch8\" (UniqueName: \"kubernetes.io/projected/2d6a5911-b7a5-4523-bead-543b0a0ccdcc-kube-api-access-2jch8\") pod \"auto-csr-approver-29533922-nkq2g\" (UID: \"2d6a5911-b7a5-4523-bead-543b0a0ccdcc\") " pod="openshift-infra/auto-csr-approver-29533922-nkq2g" Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.397687 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jch8\" (UniqueName: \"kubernetes.io/projected/2d6a5911-b7a5-4523-bead-543b0a0ccdcc-kube-api-access-2jch8\") pod \"auto-csr-approver-29533922-nkq2g\" (UID: \"2d6a5911-b7a5-4523-bead-543b0a0ccdcc\") " pod="openshift-infra/auto-csr-approver-29533922-nkq2g" Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.428408 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jch8\" (UniqueName: \"kubernetes.io/projected/2d6a5911-b7a5-4523-bead-543b0a0ccdcc-kube-api-access-2jch8\") pod \"auto-csr-approver-29533922-nkq2g\" (UID: \"2d6a5911-b7a5-4523-bead-543b0a0ccdcc\") " pod="openshift-infra/auto-csr-approver-29533922-nkq2g" Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.501632 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533922-nkq2g" Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.533356 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" event={"ID":"6b58d852-ef69-4a94-8e1b-8892612ff7aa","Type":"ContainerStarted","Data":"54082723fb8c3114ab2f68740c8c34b0baf564aba69819434e2baae54971eb0b"} Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.535160 4937 generic.go:334] "Generic (PLEG): container finished" podID="17373bf5-4311-4847-bfbf-3b346a214d8c" containerID="2ba8f41d8fda95489d38ef00cafd16ede50d5c099ce5bf393f4739f4478d0832" exitCode=0 Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.535191 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6pxs" event={"ID":"17373bf5-4311-4847-bfbf-3b346a214d8c","Type":"ContainerDied","Data":"2ba8f41d8fda95489d38ef00cafd16ede50d5c099ce5bf393f4739f4478d0832"} Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.535210 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6pxs" event={"ID":"17373bf5-4311-4847-bfbf-3b346a214d8c","Type":"ContainerStarted","Data":"504a6beed0bcda7ee319432acfa600b9f29a65857a186c2b14ace1390bd2da32"} Feb 25 16:02:00 crc kubenswrapper[4937]: I0225 16:02:00.670146 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533922-nkq2g"] Feb 25 16:02:00 crc kubenswrapper[4937]: W0225 16:02:00.674552 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d6a5911_b7a5_4523_bead_543b0a0ccdcc.slice/crio-548c716b4329385a3f704d2c4cd1e220053737a514950138af7a338e2a1a89e0 WatchSource:0}: Error finding container 548c716b4329385a3f704d2c4cd1e220053737a514950138af7a338e2a1a89e0: Status 404 returned error can't find the container with id 548c716b4329385a3f704d2c4cd1e220053737a514950138af7a338e2a1a89e0 Feb 25 16:02:01 crc kubenswrapper[4937]: I0225 16:02:01.544813 4937 generic.go:334] "Generic (PLEG): container finished" podID="6b58d852-ef69-4a94-8e1b-8892612ff7aa" containerID="54082723fb8c3114ab2f68740c8c34b0baf564aba69819434e2baae54971eb0b" exitCode=0 Feb 25 16:02:01 crc kubenswrapper[4937]: I0225 16:02:01.544892 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" event={"ID":"6b58d852-ef69-4a94-8e1b-8892612ff7aa","Type":"ContainerDied","Data":"54082723fb8c3114ab2f68740c8c34b0baf564aba69819434e2baae54971eb0b"} Feb 25 16:02:01 crc kubenswrapper[4937]: I0225 16:02:01.546506 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533922-nkq2g" event={"ID":"2d6a5911-b7a5-4523-bead-543b0a0ccdcc","Type":"ContainerStarted","Data":"548c716b4329385a3f704d2c4cd1e220053737a514950138af7a338e2a1a89e0"} Feb 25 16:02:02 crc kubenswrapper[4937]: I0225 16:02:02.553582 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533922-nkq2g" event={"ID":"2d6a5911-b7a5-4523-bead-543b0a0ccdcc","Type":"ContainerStarted","Data":"8792f524feedae4df73e9bab681c0463dc904dabe95f74cc4a2f01194714eded"} Feb 25 16:02:02 crc kubenswrapper[4937]: I0225 16:02:02.556696 4937 generic.go:334] "Generic (PLEG): container finished" podID="6b58d852-ef69-4a94-8e1b-8892612ff7aa" 
containerID="f6129bf61254d5426b82b6581e3379b77ec23932d257eb83ee4adaa484d529bb" exitCode=0 Feb 25 16:02:02 crc kubenswrapper[4937]: I0225 16:02:02.556756 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" event={"ID":"6b58d852-ef69-4a94-8e1b-8892612ff7aa","Type":"ContainerDied","Data":"f6129bf61254d5426b82b6581e3379b77ec23932d257eb83ee4adaa484d529bb"} Feb 25 16:02:02 crc kubenswrapper[4937]: I0225 16:02:02.559098 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6pxs" event={"ID":"17373bf5-4311-4847-bfbf-3b346a214d8c","Type":"ContainerStarted","Data":"8ff9e6538a27044eaa3a44f33f87566cffc7cae195ab08b6b6f0eb611c987241"} Feb 25 16:02:03 crc kubenswrapper[4937]: I0225 16:02:03.569929 4937 generic.go:334] "Generic (PLEG): container finished" podID="2d6a5911-b7a5-4523-bead-543b0a0ccdcc" containerID="8792f524feedae4df73e9bab681c0463dc904dabe95f74cc4a2f01194714eded" exitCode=0 Feb 25 16:02:03 crc kubenswrapper[4937]: I0225 16:02:03.570040 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533922-nkq2g" event={"ID":"2d6a5911-b7a5-4523-bead-543b0a0ccdcc","Type":"ContainerDied","Data":"8792f524feedae4df73e9bab681c0463dc904dabe95f74cc4a2f01194714eded"} Feb 25 16:02:03 crc kubenswrapper[4937]: I0225 16:02:03.990608 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" Feb 25 16:02:04 crc kubenswrapper[4937]: E0225 16:02:04.019553 4937 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17373bf5_4311_4847_bfbf_3b346a214d8c.slice/crio-conmon-8ff9e6538a27044eaa3a44f33f87566cffc7cae195ab08b6b6f0eb611c987241.scope\": RecentStats: unable to find data in memory cache]" Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.155707 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rrv7\" (UniqueName: \"kubernetes.io/projected/6b58d852-ef69-4a94-8e1b-8892612ff7aa-kube-api-access-2rrv7\") pod \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\" (UID: \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\") " Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.156017 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b58d852-ef69-4a94-8e1b-8892612ff7aa-bundle\") pod \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\" (UID: \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\") " Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.156122 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b58d852-ef69-4a94-8e1b-8892612ff7aa-util\") pod \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\" (UID: \"6b58d852-ef69-4a94-8e1b-8892612ff7aa\") " Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.160332 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b58d852-ef69-4a94-8e1b-8892612ff7aa-bundle" (OuterVolumeSpecName: "bundle") pod "6b58d852-ef69-4a94-8e1b-8892612ff7aa" (UID: "6b58d852-ef69-4a94-8e1b-8892612ff7aa"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.164815 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b58d852-ef69-4a94-8e1b-8892612ff7aa-kube-api-access-2rrv7" (OuterVolumeSpecName: "kube-api-access-2rrv7") pod "6b58d852-ef69-4a94-8e1b-8892612ff7aa" (UID: "6b58d852-ef69-4a94-8e1b-8892612ff7aa"). InnerVolumeSpecName "kube-api-access-2rrv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.171076 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b58d852-ef69-4a94-8e1b-8892612ff7aa-util" (OuterVolumeSpecName: "util") pod "6b58d852-ef69-4a94-8e1b-8892612ff7aa" (UID: "6b58d852-ef69-4a94-8e1b-8892612ff7aa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.258109 4937 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b58d852-ef69-4a94-8e1b-8892612ff7aa-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.258409 4937 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b58d852-ef69-4a94-8e1b-8892612ff7aa-util\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.258515 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rrv7\" (UniqueName: \"kubernetes.io/projected/6b58d852-ef69-4a94-8e1b-8892612ff7aa-kube-api-access-2rrv7\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.578254 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" event={"ID":"6b58d852-ef69-4a94-8e1b-8892612ff7aa","Type":"ContainerDied","Data":"bb55e4f7f0064ea4376caebaae78b4a82982488899da464bd3a1c1309caed73d"} Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.578301 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb55e4f7f0064ea4376caebaae78b4a82982488899da464bd3a1c1309caed73d" Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.578347 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7" Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.580893 4937 generic.go:334] "Generic (PLEG): container finished" podID="17373bf5-4311-4847-bfbf-3b346a214d8c" containerID="8ff9e6538a27044eaa3a44f33f87566cffc7cae195ab08b6b6f0eb611c987241" exitCode=0 Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.580936 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6pxs" event={"ID":"17373bf5-4311-4847-bfbf-3b346a214d8c","Type":"ContainerDied","Data":"8ff9e6538a27044eaa3a44f33f87566cffc7cae195ab08b6b6f0eb611c987241"} Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.833223 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533922-nkq2g" Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.966689 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jch8\" (UniqueName: \"kubernetes.io/projected/2d6a5911-b7a5-4523-bead-543b0a0ccdcc-kube-api-access-2jch8\") pod \"2d6a5911-b7a5-4523-bead-543b0a0ccdcc\" (UID: \"2d6a5911-b7a5-4523-bead-543b0a0ccdcc\") " Feb 25 16:02:04 crc kubenswrapper[4937]: I0225 16:02:04.972763 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d6a5911-b7a5-4523-bead-543b0a0ccdcc-kube-api-access-2jch8" (OuterVolumeSpecName: "kube-api-access-2jch8") pod "2d6a5911-b7a5-4523-bead-543b0a0ccdcc" (UID: "2d6a5911-b7a5-4523-bead-543b0a0ccdcc"). InnerVolumeSpecName "kube-api-access-2jch8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:02:05 crc kubenswrapper[4937]: I0225 16:02:05.068320 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jch8\" (UniqueName: \"kubernetes.io/projected/2d6a5911-b7a5-4523-bead-543b0a0ccdcc-kube-api-access-2jch8\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:05 crc kubenswrapper[4937]: I0225 16:02:05.588751 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533922-nkq2g" event={"ID":"2d6a5911-b7a5-4523-bead-543b0a0ccdcc","Type":"ContainerDied","Data":"548c716b4329385a3f704d2c4cd1e220053737a514950138af7a338e2a1a89e0"} Feb 25 16:02:05 crc kubenswrapper[4937]: I0225 16:02:05.588810 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="548c716b4329385a3f704d2c4cd1e220053737a514950138af7a338e2a1a89e0" Feb 25 16:02:05 crc kubenswrapper[4937]: I0225 16:02:05.588895 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533922-nkq2g" Feb 25 16:02:05 crc kubenswrapper[4937]: I0225 16:02:05.594058 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6pxs" event={"ID":"17373bf5-4311-4847-bfbf-3b346a214d8c","Type":"ContainerStarted","Data":"094141288aa922e23f72a6fe3235b6503b2e76ee3aa69151b539fda9feb5ee09"} Feb 25 16:02:05 crc kubenswrapper[4937]: I0225 16:02:05.617761 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q6pxs" podStartSLOduration=2.050621738 podStartE2EDuration="6.617731143s" podCreationTimestamp="2026-02-25 16:01:59 +0000 UTC" firstStartedPulling="2026-02-25 16:02:00.536465755 +0000 UTC m=+971.549857645" lastFinishedPulling="2026-02-25 16:02:05.10357512 +0000 UTC m=+976.116967050" observedRunningTime="2026-02-25 16:02:05.61679566 +0000 UTC m=+976.630187590" watchObservedRunningTime="2026-02-25 16:02:05.617731143 +0000 UTC m=+976.631123073" Feb 25 16:02:05 crc kubenswrapper[4937]: I0225 16:02:05.913962 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533916-54vkc"] Feb 25 16:02:05 crc kubenswrapper[4937]: I0225 16:02:05.923125 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533916-54vkc"] Feb 25 16:02:07 crc kubenswrapper[4937]: I0225 16:02:07.375912 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98e29452-771e-4286-9898-d0b8a14bb0e7" path="/var/lib/kubelet/pods/98e29452-771e-4286-9898-d0b8a14bb0e7/volumes" Feb 25 16:02:07 crc kubenswrapper[4937]: I0225 16:02:07.878559 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cl2zn"] Feb 25 16:02:07 crc kubenswrapper[4937]: I0225 16:02:07.879028 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5" gracePeriod=30 Feb 25 16:02:07 crc kubenswrapper[4937]: I0225 16:02:07.879085 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovn-controller" containerID="cri-o://ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46" gracePeriod=30 Feb 25 16:02:07 crc kubenswrapper[4937]: I0225 16:02:07.879083 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="sbdb" containerID="cri-o://b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55" gracePeriod=30 Feb 25 16:02:07 crc kubenswrapper[4937]: I0225 16:02:07.879061 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovn-acl-logging" containerID="cri-o://cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12" gracePeriod=30 Feb 25 16:02:07 crc kubenswrapper[4937]: I0225 16:02:07.879167 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="northd" 
containerID="cri-o://f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2" gracePeriod=30 Feb 25 16:02:07 crc kubenswrapper[4937]: I0225 16:02:07.879038 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="kube-rbac-proxy-node" containerID="cri-o://17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d" gracePeriod=30 Feb 25 16:02:07 crc kubenswrapper[4937]: I0225 16:02:07.879341 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="nbdb" containerID="cri-o://6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9" gracePeriod=30 Feb 25 16:02:07 crc kubenswrapper[4937]: I0225 16:02:07.951099 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" containerID="cri-o://e37144d901d75f45a8471889632d291826e668961da7dfc2db6033335001e105" gracePeriod=30 Feb 25 16:02:09 crc kubenswrapper[4937]: I0225 16:02:09.408267 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:02:09 crc kubenswrapper[4937]: I0225 16:02:09.408653 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:02:09 crc kubenswrapper[4937]: I0225 16:02:09.626360 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/3.log" Feb 25 16:02:09 crc kubenswrapper[4937]: I0225 16:02:09.630512 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovn-acl-logging/0.log" Feb 25 16:02:09 crc kubenswrapper[4937]: I0225 16:02:09.631999 4937 generic.go:334] "Generic (PLEG): container finished" podID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerID="cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12" exitCode=143 Feb 25 16:02:09 crc kubenswrapper[4937]: I0225 16:02:09.632075 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12"} Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.487274 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q6pxs" podUID="17373bf5-4311-4847-bfbf-3b346a214d8c" containerName="registry-server" probeResult="failure" output=< Feb 25 16:02:10 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 16:02:10 crc kubenswrapper[4937]: > Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.637980 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlbgx_f193b13f-50ab-454a-9230-a96922b25186/kube-multus/2.log" Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.638525 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlbgx_f193b13f-50ab-454a-9230-a96922b25186/kube-multus/1.log" Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.638571 4937 generic.go:334] "Generic 
(PLEG): container finished" podID="f193b13f-50ab-454a-9230-a96922b25186" containerID="0451ad74afcc7887e90ed45f51efddafe48efb2249762c9aa5e2da84d9691199" exitCode=2 Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.638652 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlbgx" event={"ID":"f193b13f-50ab-454a-9230-a96922b25186","Type":"ContainerDied","Data":"0451ad74afcc7887e90ed45f51efddafe48efb2249762c9aa5e2da84d9691199"} Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.638704 4937 scope.go:117] "RemoveContainer" containerID="1d677612e23253e09a2a6bed76138c39ace5b451a67bb9fd309647de2d8b6b02" Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.639131 4937 scope.go:117] "RemoveContainer" containerID="0451ad74afcc7887e90ed45f51efddafe48efb2249762c9aa5e2da84d9691199" Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.654680 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/3.log" Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.656672 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovn-acl-logging/0.log" Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.657002 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovn-controller/0.log" Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.657266 4937 generic.go:334] "Generic (PLEG): container finished" podID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerID="e37144d901d75f45a8471889632d291826e668961da7dfc2db6033335001e105" exitCode=0 Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.657287 4937 generic.go:334] "Generic (PLEG): container finished" podID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerID="b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55" exitCode=0 Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.657296 4937 generic.go:334] "Generic (PLEG): container finished" podID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerID="6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9" exitCode=0 Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.657303 4937 generic.go:334] "Generic (PLEG): container finished" podID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerID="3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5" exitCode=0 Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.657311 4937 generic.go:334] "Generic (PLEG): container finished" podID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerID="17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d" exitCode=0 Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.657317 4937 generic.go:334] "Generic (PLEG): container finished" podID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerID="ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46" exitCode=143 Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.657335 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"e37144d901d75f45a8471889632d291826e668961da7dfc2db6033335001e105"} Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.657360 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" 
event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55"} Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.657370 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9"} Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.657379 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5"} Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.657388 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d"} Feb 25 16:02:10 crc kubenswrapper[4937]: I0225 16:02:10.657397 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46"} Feb 25 16:02:11 crc kubenswrapper[4937]: I0225 16:02:11.664505 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovnkube-controller/3.log" Feb 25 16:02:11 crc kubenswrapper[4937]: I0225 16:02:11.666122 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovn-acl-logging/0.log" Feb 25 16:02:11 crc kubenswrapper[4937]: I0225 16:02:11.666505 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovn-controller/0.log" Feb 25 16:02:11 crc kubenswrapper[4937]: I0225 16:02:11.666772 4937 generic.go:334] "Generic (PLEG): container finished" podID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerID="f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2" exitCode=0 Feb 25 16:02:11 crc kubenswrapper[4937]: I0225 16:02:11.666806 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2"} Feb 25 16:02:11 crc kubenswrapper[4937]: E0225 16:02:11.834750 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e37144d901d75f45a8471889632d291826e668961da7dfc2db6033335001e105 is running failed: container process not found" containerID="e37144d901d75f45a8471889632d291826e668961da7dfc2db6033335001e105" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Feb 25 16:02:11 crc kubenswrapper[4937]: E0225 16:02:11.834763 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55 is running failed: container process not found" 
containerID="b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Feb 25 16:02:11 crc kubenswrapper[4937]: E0225 16:02:11.834771 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9 is running failed: container process not found" containerID="6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Feb 25 16:02:11 crc kubenswrapper[4937]: E0225 16:02:11.835237 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55 is running failed: container process not found" containerID="b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Feb 25 16:02:11 crc kubenswrapper[4937]: E0225 16:02:11.835269 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e37144d901d75f45a8471889632d291826e668961da7dfc2db6033335001e105 is running failed: container process not found" containerID="e37144d901d75f45a8471889632d291826e668961da7dfc2db6033335001e105" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Feb 25 16:02:11 crc kubenswrapper[4937]: E0225 16:02:11.835247 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9 is running failed: container process not found" containerID="6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Feb 25 16:02:11 crc kubenswrapper[4937]: E0225 16:02:11.835612 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e37144d901d75f45a8471889632d291826e668961da7dfc2db6033335001e105 is running failed: container process not found" containerID="e37144d901d75f45a8471889632d291826e668961da7dfc2db6033335001e105" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Feb 25 16:02:11 crc kubenswrapper[4937]: E0225 16:02:11.835627 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9 is running failed: container process not found" containerID="6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Feb 25 16:02:11 crc kubenswrapper[4937]: E0225 16:02:11.835654 4937 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="nbdb" Feb 25 16:02:11 crc kubenswrapper[4937]: E0225 16:02:11.835654 4937 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e37144d901d75f45a8471889632d291826e668961da7dfc2db6033335001e105 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:11 crc kubenswrapper[4937]: E0225 16:02:11.835731 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55 is running failed: container process not found" containerID="b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Feb 25 16:02:11 crc kubenswrapper[4937]: E0225 16:02:11.835748 4937 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="sbdb" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.363833 4937 scope.go:117] "RemoveContainer" containerID="d3ea59bb1816d1d9773c4d501d1e15f12b6727f45cca1fdc4c7b9ebf620942ee" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.397898 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovn-acl-logging/0.log" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.398465 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovn-controller/0.log" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.398907 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.472866 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t9fdh"] Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473066 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473077 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473085 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovn-acl-logging" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473090 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovn-acl-logging" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473100 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="northd" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473107 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="northd" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473118 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="nbdb" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473123 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="nbdb" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473132 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6a5911-b7a5-4523-bead-543b0a0ccdcc" containerName="oc" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473138 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6a5911-b7a5-4523-bead-543b0a0ccdcc" containerName="oc" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473147 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473152 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473159 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473164 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473172 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="kube-rbac-proxy-node" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473178 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="kube-rbac-proxy-node" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473186 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" 
containerName="sbdb" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473191 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="sbdb" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473199 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b58d852-ef69-4a94-8e1b-8892612ff7aa" containerName="util" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473205 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b58d852-ef69-4a94-8e1b-8892612ff7aa" containerName="util" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473214 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovn-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473219 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovn-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473228 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b58d852-ef69-4a94-8e1b-8892612ff7aa" containerName="extract" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473233 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b58d852-ef69-4a94-8e1b-8892612ff7aa" containerName="extract" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473242 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="kube-rbac-proxy-ovn-metrics" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473247 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="kube-rbac-proxy-ovn-metrics" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473256 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="kubecfg-setup" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473262 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="kubecfg-setup" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473270 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b58d852-ef69-4a94-8e1b-8892612ff7aa" containerName="pull" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473276 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b58d852-ef69-4a94-8e1b-8892612ff7aa" containerName="pull" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473369 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="northd" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473379 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovn-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473386 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473393 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473403 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" 
containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473410 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="kube-rbac-proxy-ovn-metrics" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473422 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b58d852-ef69-4a94-8e1b-8892612ff7aa" containerName="extract" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473430 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="kube-rbac-proxy-node" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473436 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="nbdb" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473444 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="sbdb" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473452 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovn-acl-logging" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473459 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473465 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6a5911-b7a5-4523-bead-543b0a0ccdcc" containerName="oc" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473560 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473567 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473647 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: E0225 16:02:12.473736 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.473743 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" containerName="ovnkube-controller" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474295 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-run-netns\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474324 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-log-socket\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474382 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-env-overrides\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474404 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-ovn\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474435 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-cni-bin\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474454 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-run-ovn-kubernetes\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474477 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-openvswitch\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474519 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-ovnkube-script-lib\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474536 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-systemd\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474564 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2px4\" (UniqueName: \"kubernetes.io/projected/89a5d3cb-d884-4e27-90df-972e98830bcb-kube-api-access-r2px4\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474580 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474597 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-cni-netd\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474621 4937 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-var-lib-openvswitch\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474637 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-slash\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474677 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89a5d3cb-d884-4e27-90df-972e98830bcb-ovn-node-metrics-cert\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474702 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-kubelet\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474722 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-ovnkube-config\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474740 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-systemd-units\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474754 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-node-log\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474775 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-etc-openvswitch\") pod \"89a5d3cb-d884-4e27-90df-972e98830bcb\" (UID: \"89a5d3cb-d884-4e27-90df-972e98830bcb\") " Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.474975 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.475011 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). 
InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.475029 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-log-socket" (OuterVolumeSpecName: "log-socket") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.475128 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.475321 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.475351 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.475387 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.475404 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.475422 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.475684 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-slash" (OuterVolumeSpecName: "host-slash") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.475775 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.476441 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.476766 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.476805 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.476800 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.476834 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-node-log" (OuterVolumeSpecName: "node-log") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.476844 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.477239 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.485825 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a5d3cb-d884-4e27-90df-972e98830bcb-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.485929 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89a5d3cb-d884-4e27-90df-972e98830bcb-kube-api-access-r2px4" (OuterVolumeSpecName: "kube-api-access-r2px4") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "kube-api-access-r2px4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.522195 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "89a5d3cb-d884-4e27-90df-972e98830bcb" (UID: "89a5d3cb-d884-4e27-90df-972e98830bcb"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.575974 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-log-socket\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576039 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-etc-openvswitch\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576087 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-run-ovn\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576122 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576207 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-node-log\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576260 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-run-netns\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576304 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61ab1c4f-9e4e-478e-aaa6-1112641048d0-env-overrides\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576326 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-run-ovn-kubernetes\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576364 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-systemd-units\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576390 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61ab1c4f-9e4e-478e-aaa6-1112641048d0-ovn-node-metrics-cert\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576436 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-kubelet\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576512 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnjxc\" (UniqueName: \"kubernetes.io/projected/61ab1c4f-9e4e-478e-aaa6-1112641048d0-kube-api-access-mnjxc\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576533 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61ab1c4f-9e4e-478e-aaa6-1112641048d0-ovnkube-script-lib\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576549 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-run-openvswitch\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576595 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-var-lib-openvswitch\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576626 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-run-systemd\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576644 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61ab1c4f-9e4e-478e-aaa6-1112641048d0-ovnkube-config\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576660 4937 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-cni-netd\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576677 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-cni-bin\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576712 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-slash\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576752 4937 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-slash\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576763 4937 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89a5d3cb-d884-4e27-90df-972e98830bcb-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576772 4937 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576781 4937 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576791 4937 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576800 4937 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-node-log\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576808 4937 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576816 4937 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576824 4937 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-log-socket\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc 
kubenswrapper[4937]: I0225 16:02:12.576833 4937 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576841 4937 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576849 4937 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576858 4937 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576866 4937 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576875 4937 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89a5d3cb-d884-4e27-90df-972e98830bcb-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576883 4937 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576891 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2px4\" (UniqueName: \"kubernetes.io/projected/89a5d3cb-d884-4e27-90df-972e98830bcb-kube-api-access-r2px4\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576899 4937 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576910 4937 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.576919 4937 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89a5d3cb-d884-4e27-90df-972e98830bcb-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.675074 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovn-acl-logging/0.log" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.675570 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cl2zn_89a5d3cb-d884-4e27-90df-972e98830bcb/ovn-controller/0.log" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.676071 4937 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.676066 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cl2zn" event={"ID":"89a5d3cb-d884-4e27-90df-972e98830bcb","Type":"ContainerDied","Data":"68e4cb7e0fbef2b311822beb59b0976faca4be58ed711777bb3daa0e8856b5cd"} Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.676188 4937 scope.go:117] "RemoveContainer" containerID="e37144d901d75f45a8471889632d291826e668961da7dfc2db6033335001e105" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.677877 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61ab1c4f-9e4e-478e-aaa6-1112641048d0-ovnkube-script-lib\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.677904 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-run-openvswitch\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.677925 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-var-lib-openvswitch\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.677941 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-run-systemd\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.677956 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61ab1c4f-9e4e-478e-aaa6-1112641048d0-ovnkube-config\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.677969 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-cni-netd\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.677985 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-cni-bin\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678004 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-slash\") pod 
\"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678029 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-log-socket\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678051 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-etc-openvswitch\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678067 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-run-ovn\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678086 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678103 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-node-log\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678118 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-run-netns\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678137 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61ab1c4f-9e4e-478e-aaa6-1112641048d0-env-overrides\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678153 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-run-ovn-kubernetes\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678166 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlbgx_f193b13f-50ab-454a-9230-a96922b25186/kube-multus/2.log" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678203 4937 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-systemd-units\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678204 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlbgx" event={"ID":"f193b13f-50ab-454a-9230-a96922b25186","Type":"ContainerStarted","Data":"6e7e07a21a6b39ff49cb1f7f49c04929415cfcde32af4d6be35ba1f3dd894faf"} Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678170 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-systemd-units\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678827 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61ab1c4f-9e4e-478e-aaa6-1112641048d0-ovnkube-script-lib\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678863 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-run-openvswitch\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678870 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61ab1c4f-9e4e-478e-aaa6-1112641048d0-ovn-node-metrics-cert\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678885 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-var-lib-openvswitch\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678905 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-run-systemd\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678904 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-kubelet\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678933 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnjxc\" (UniqueName: \"kubernetes.io/projected/61ab1c4f-9e4e-478e-aaa6-1112641048d0-kube-api-access-mnjxc\") pod 
\"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.678945 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-kubelet\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.679528 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61ab1c4f-9e4e-478e-aaa6-1112641048d0-ovnkube-config\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.679579 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-cni-netd\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.679604 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-cni-bin\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.679624 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-slash\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.679645 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-log-socket\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.679664 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-etc-openvswitch\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.679683 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-run-ovn\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.679703 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc 
kubenswrapper[4937]: I0225 16:02:12.679724 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-node-log\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.679743 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-run-netns\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.680022 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61ab1c4f-9e4e-478e-aaa6-1112641048d0-env-overrides\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.680058 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61ab1c4f-9e4e-478e-aaa6-1112641048d0-host-run-ovn-kubernetes\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.682461 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61ab1c4f-9e4e-478e-aaa6-1112641048d0-ovn-node-metrics-cert\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.702065 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnjxc\" (UniqueName: \"kubernetes.io/projected/61ab1c4f-9e4e-478e-aaa6-1112641048d0-kube-api-access-mnjxc\") pod \"ovnkube-node-t9fdh\" (UID: \"61ab1c4f-9e4e-478e-aaa6-1112641048d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.702893 4937 scope.go:117] "RemoveContainer" containerID="b1215a60f6b211f4f611a364954d009f73984d481577d52256a7af0b1c7bed55" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.715768 4937 scope.go:117] "RemoveContainer" containerID="6773ea7af6ecd5a87dc7dee7ab7e45cbabf0401989362facbdca6958c69022a9" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.733394 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cl2zn"] Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.734620 4937 scope.go:117] "RemoveContainer" containerID="f2ab439dbe768a90bbe299eb0d8fc5aa100c6dd21ebfd6d314759ac28869e6c2" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.740761 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cl2zn"] Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.757659 4937 scope.go:117] "RemoveContainer" containerID="3f4bdb1a6fc97b7477c1c0af7bc5a300fb07506a703d4e8a2f2cef3c58870cf5" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.771426 4937 scope.go:117] "RemoveContainer" containerID="17ec15b3ce3443536d21f91fe425d1818162f249600224c21ea288c6142c645d" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 
16:02:12.785044 4937 scope.go:117] "RemoveContainer" containerID="cd52aaa215452bb8e332b8ff0ddcbf553f4596e268af734cc9fd16662c532d12" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.800975 4937 scope.go:117] "RemoveContainer" containerID="ce4d4cf6c3867dc41e5a6be8f4a91068263ccde3885780a7129c09d2f5dc7d46" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.814945 4937 scope.go:117] "RemoveContainer" containerID="1fbff2139a8f47941875ec7bf3416da8ec161576f6e0628420253ad192662411" Feb 25 16:02:12 crc kubenswrapper[4937]: I0225 16:02:12.832477 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:12 crc kubenswrapper[4937]: W0225 16:02:12.850805 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61ab1c4f_9e4e_478e_aaa6_1112641048d0.slice/crio-e207122e24b8dd95cdb64f073373775daac2ee4d6df445191339bf18251bc0aa WatchSource:0}: Error finding container e207122e24b8dd95cdb64f073373775daac2ee4d6df445191339bf18251bc0aa: Status 404 returned error can't find the container with id e207122e24b8dd95cdb64f073373775daac2ee4d6df445191339bf18251bc0aa Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.373117 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89a5d3cb-d884-4e27-90df-972e98830bcb" path="/var/lib/kubelet/pods/89a5d3cb-d884-4e27-90df-972e98830bcb/volumes" Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.684933 4937 generic.go:334] "Generic (PLEG): container finished" podID="61ab1c4f-9e4e-478e-aaa6-1112641048d0" containerID="755ddbaad8d4e01d2737b3166612fa06e6366c4011e2aa8abe03fcc4c189c5ad" exitCode=0 Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.685039 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" event={"ID":"61ab1c4f-9e4e-478e-aaa6-1112641048d0","Type":"ContainerDied","Data":"755ddbaad8d4e01d2737b3166612fa06e6366c4011e2aa8abe03fcc4c189c5ad"} Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.685064 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" event={"ID":"61ab1c4f-9e4e-478e-aaa6-1112641048d0","Type":"ContainerStarted","Data":"e207122e24b8dd95cdb64f073373775daac2ee4d6df445191339bf18251bc0aa"} Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.861540 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9"] Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.862324 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.864312 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-jb5wc" Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.864627 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.864785 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.892571 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b"] Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.893339 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.895994 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-wmsf4" Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.896652 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.903226 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9"] Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.903930 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.991778 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f8315c1-97ca-4525-a1a8-afe98581f614-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9\" (UID: \"8f8315c1-97ca-4525-a1a8-afe98581f614\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.991823 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f8315c1-97ca-4525-a1a8-afe98581f614-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9\" (UID: \"8f8315c1-97ca-4525-a1a8-afe98581f614\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.991849 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1cad8e2a-5182-4d59-9afa-c64ced98e87b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b\" (UID: \"1cad8e2a-5182-4d59-9afa-c64ced98e87b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.991870 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9x78\" (UniqueName: \"kubernetes.io/projected/4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af-kube-api-access-q9x78\") pod \"obo-prometheus-operator-68bc856cb9-46sv9\" (UID: \"4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" Feb 25 16:02:13 crc kubenswrapper[4937]: I0225 16:02:13.991979 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1cad8e2a-5182-4d59-9afa-c64ced98e87b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b\" (UID: \"1cad8e2a-5182-4d59-9afa-c64ced98e87b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.080994 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-p5kbh"] Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.081743 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.084061 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.084310 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-6zhfm" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.092956 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1cad8e2a-5182-4d59-9afa-c64ced98e87b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b\" (UID: \"1cad8e2a-5182-4d59-9afa-c64ced98e87b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.093072 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f8315c1-97ca-4525-a1a8-afe98581f614-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9\" (UID: \"8f8315c1-97ca-4525-a1a8-afe98581f614\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.093093 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f8315c1-97ca-4525-a1a8-afe98581f614-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9\" (UID: \"8f8315c1-97ca-4525-a1a8-afe98581f614\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.093118 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1cad8e2a-5182-4d59-9afa-c64ced98e87b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b\" (UID: \"1cad8e2a-5182-4d59-9afa-c64ced98e87b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.093139 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9x78\" (UniqueName: \"kubernetes.io/projected/4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af-kube-api-access-q9x78\") pod \"obo-prometheus-operator-68bc856cb9-46sv9\" (UID: \"4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.097624 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f8315c1-97ca-4525-a1a8-afe98581f614-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9\" (UID: \"8f8315c1-97ca-4525-a1a8-afe98581f614\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.097655 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1cad8e2a-5182-4d59-9afa-c64ced98e87b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b\" (UID: \"1cad8e2a-5182-4d59-9afa-c64ced98e87b\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.098965 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f8315c1-97ca-4525-a1a8-afe98581f614-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9\" (UID: \"8f8315c1-97ca-4525-a1a8-afe98581f614\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.099922 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1cad8e2a-5182-4d59-9afa-c64ced98e87b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b\" (UID: \"1cad8e2a-5182-4d59-9afa-c64ced98e87b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.113196 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9x78\" (UniqueName: \"kubernetes.io/projected/4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af-kube-api-access-q9x78\") pod \"obo-prometheus-operator-68bc856cb9-46sv9\" (UID: \"4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.176226 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.184700 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-prw69"] Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.185665 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.187953 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6ptxn" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.194463 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7bfk\" (UniqueName: \"kubernetes.io/projected/0eb822b0-826a-4b2d-9376-141a69ba37e5-kube-api-access-v7bfk\") pod \"observability-operator-59bdc8b94-p5kbh\" (UID: \"0eb822b0-826a-4b2d-9376-141a69ba37e5\") " pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.194564 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0eb822b0-826a-4b2d-9376-141a69ba37e5-observability-operator-tls\") pod \"observability-operator-59bdc8b94-p5kbh\" (UID: \"0eb822b0-826a-4b2d-9376-141a69ba37e5\") " pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.203995 4937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-46sv9_openshift-operators_4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af_0(4585e7e55407892da7fef2f63a066546f683cba8e5210db982d0bd7272e0112c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.204068 4937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-46sv9_openshift-operators_4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af_0(4585e7e55407892da7fef2f63a066546f683cba8e5210db982d0bd7272e0112c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.204097 4937 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-46sv9_openshift-operators_4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af_0(4585e7e55407892da7fef2f63a066546f683cba8e5210db982d0bd7272e0112c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.204141 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-46sv9_openshift-operators(4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-46sv9_openshift-operators(4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-46sv9_openshift-operators_4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af_0(4585e7e55407892da7fef2f63a066546f683cba8e5210db982d0bd7272e0112c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" podUID="4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.209379 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.227224 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.232720 4937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_openshift-operators_1cad8e2a-5182-4d59-9afa-c64ced98e87b_0(a424de19ab3003b7eb79a6dd795e3a616b136dd3bd680e89a112d43b9c712790): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.232785 4937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_openshift-operators_1cad8e2a-5182-4d59-9afa-c64ced98e87b_0(a424de19ab3003b7eb79a6dd795e3a616b136dd3bd680e89a112d43b9c712790): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.232806 4937 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_openshift-operators_1cad8e2a-5182-4d59-9afa-c64ced98e87b_0(a424de19ab3003b7eb79a6dd795e3a616b136dd3bd680e89a112d43b9c712790): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.232843 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_openshift-operators(1cad8e2a-5182-4d59-9afa-c64ced98e87b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_openshift-operators(1cad8e2a-5182-4d59-9afa-c64ced98e87b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_openshift-operators_1cad8e2a-5182-4d59-9afa-c64ced98e87b_0(a424de19ab3003b7eb79a6dd795e3a616b136dd3bd680e89a112d43b9c712790): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" podUID="1cad8e2a-5182-4d59-9afa-c64ced98e87b" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.268468 4937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_openshift-operators_8f8315c1-97ca-4525-a1a8-afe98581f614_0(bd7dcf28b72d1fab3d33e4432ddfbf36def77d9565e396200e7316ea986e8e53): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.268598 4937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_openshift-operators_8f8315c1-97ca-4525-a1a8-afe98581f614_0(bd7dcf28b72d1fab3d33e4432ddfbf36def77d9565e396200e7316ea986e8e53): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.268628 4937 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_openshift-operators_8f8315c1-97ca-4525-a1a8-afe98581f614_0(bd7dcf28b72d1fab3d33e4432ddfbf36def77d9565e396200e7316ea986e8e53): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.269755 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_openshift-operators(8f8315c1-97ca-4525-a1a8-afe98581f614)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_openshift-operators(8f8315c1-97ca-4525-a1a8-afe98581f614)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_openshift-operators_8f8315c1-97ca-4525-a1a8-afe98581f614_0(bd7dcf28b72d1fab3d33e4432ddfbf36def77d9565e396200e7316ea986e8e53): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" podUID="8f8315c1-97ca-4525-a1a8-afe98581f614" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.296066 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/26437cd5-3ce5-4d7a-9b7f-9f983015f74d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-prw69\" (UID: \"26437cd5-3ce5-4d7a-9b7f-9f983015f74d\") " pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.296137 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7bfk\" (UniqueName: \"kubernetes.io/projected/0eb822b0-826a-4b2d-9376-141a69ba37e5-kube-api-access-v7bfk\") pod \"observability-operator-59bdc8b94-p5kbh\" (UID: \"0eb822b0-826a-4b2d-9376-141a69ba37e5\") " pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.296165 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdf9b\" (UniqueName: \"kubernetes.io/projected/26437cd5-3ce5-4d7a-9b7f-9f983015f74d-kube-api-access-hdf9b\") pod \"perses-operator-5bf474d74f-prw69\" (UID: \"26437cd5-3ce5-4d7a-9b7f-9f983015f74d\") " pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.296259 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0eb822b0-826a-4b2d-9376-141a69ba37e5-observability-operator-tls\") pod \"observability-operator-59bdc8b94-p5kbh\" (UID: \"0eb822b0-826a-4b2d-9376-141a69ba37e5\") " pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.305322 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0eb822b0-826a-4b2d-9376-141a69ba37e5-observability-operator-tls\") pod \"observability-operator-59bdc8b94-p5kbh\" (UID: \"0eb822b0-826a-4b2d-9376-141a69ba37e5\") " pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.330546 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7bfk\" (UniqueName: \"kubernetes.io/projected/0eb822b0-826a-4b2d-9376-141a69ba37e5-kube-api-access-v7bfk\") pod \"observability-operator-59bdc8b94-p5kbh\" (UID: 
\"0eb822b0-826a-4b2d-9376-141a69ba37e5\") " pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.398812 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.400162 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdf9b\" (UniqueName: \"kubernetes.io/projected/26437cd5-3ce5-4d7a-9b7f-9f983015f74d-kube-api-access-hdf9b\") pod \"perses-operator-5bf474d74f-prw69\" (UID: \"26437cd5-3ce5-4d7a-9b7f-9f983015f74d\") " pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.400289 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/26437cd5-3ce5-4d7a-9b7f-9f983015f74d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-prw69\" (UID: \"26437cd5-3ce5-4d7a-9b7f-9f983015f74d\") " pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.401376 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/26437cd5-3ce5-4d7a-9b7f-9f983015f74d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-prw69\" (UID: \"26437cd5-3ce5-4d7a-9b7f-9f983015f74d\") " pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.424580 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdf9b\" (UniqueName: \"kubernetes.io/projected/26437cd5-3ce5-4d7a-9b7f-9f983015f74d-kube-api-access-hdf9b\") pod \"perses-operator-5bf474d74f-prw69\" (UID: \"26437cd5-3ce5-4d7a-9b7f-9f983015f74d\") " pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.460677 4937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-p5kbh_openshift-operators_0eb822b0-826a-4b2d-9376-141a69ba37e5_0(5a23dc049ecfa8192d7d0aba6e91e94a82348d5e6734e159b7409c41bf084cb8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.460743 4937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-p5kbh_openshift-operators_0eb822b0-826a-4b2d-9376-141a69ba37e5_0(5a23dc049ecfa8192d7d0aba6e91e94a82348d5e6734e159b7409c41bf084cb8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.460766 4937 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-p5kbh_openshift-operators_0eb822b0-826a-4b2d-9376-141a69ba37e5_0(5a23dc049ecfa8192d7d0aba6e91e94a82348d5e6734e159b7409c41bf084cb8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.460811 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-p5kbh_openshift-operators(0eb822b0-826a-4b2d-9376-141a69ba37e5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-p5kbh_openshift-operators(0eb822b0-826a-4b2d-9376-141a69ba37e5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-p5kbh_openshift-operators_0eb822b0-826a-4b2d-9376-141a69ba37e5_0(5a23dc049ecfa8192d7d0aba6e91e94a82348d5e6734e159b7409c41bf084cb8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" podUID="0eb822b0-826a-4b2d-9376-141a69ba37e5" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.525812 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.550188 4937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-prw69_openshift-operators_26437cd5-3ce5-4d7a-9b7f-9f983015f74d_0(7287493e4e3abc6c2a596ea92a44ac06d93b4119eac9f3e53777d900576936a7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.550263 4937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-prw69_openshift-operators_26437cd5-3ce5-4d7a-9b7f-9f983015f74d_0(7287493e4e3abc6c2a596ea92a44ac06d93b4119eac9f3e53777d900576936a7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.550291 4937 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-prw69_openshift-operators_26437cd5-3ce5-4d7a-9b7f-9f983015f74d_0(7287493e4e3abc6c2a596ea92a44ac06d93b4119eac9f3e53777d900576936a7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:14 crc kubenswrapper[4937]: E0225 16:02:14.550347 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-prw69_openshift-operators(26437cd5-3ce5-4d7a-9b7f-9f983015f74d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-prw69_openshift-operators(26437cd5-3ce5-4d7a-9b7f-9f983015f74d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-prw69_openshift-operators_26437cd5-3ce5-4d7a-9b7f-9f983015f74d_0(7287493e4e3abc6c2a596ea92a44ac06d93b4119eac9f3e53777d900576936a7): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-prw69" podUID="26437cd5-3ce5-4d7a-9b7f-9f983015f74d" Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.695758 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" event={"ID":"61ab1c4f-9e4e-478e-aaa6-1112641048d0","Type":"ContainerStarted","Data":"2cda6be827871b3bd529944f650686243d2de7c2d37528a44c9bf4ba68af765f"} Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.696104 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" event={"ID":"61ab1c4f-9e4e-478e-aaa6-1112641048d0","Type":"ContainerStarted","Data":"5bdf1cad237ef70df03fe768f1a50026b0a5850f6d210497bc109a416870d88f"} Feb 25 16:02:14 crc kubenswrapper[4937]: I0225 16:02:14.696115 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" event={"ID":"61ab1c4f-9e4e-478e-aaa6-1112641048d0","Type":"ContainerStarted","Data":"304d07cef4fe51f46005f212d98afb883e83944f5c413ffb492fdc7b87169a93"} Feb 25 16:02:15 crc kubenswrapper[4937]: I0225 16:02:15.702803 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" event={"ID":"61ab1c4f-9e4e-478e-aaa6-1112641048d0","Type":"ContainerStarted","Data":"9bec31bad3d8b1ab03f93ccd550ddc420264600d15865e41d911b61e3b5b2b8d"} Feb 25 16:02:15 crc kubenswrapper[4937]: I0225 16:02:15.702845 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" event={"ID":"61ab1c4f-9e4e-478e-aaa6-1112641048d0","Type":"ContainerStarted","Data":"31a8057ca6d68d92e8e432a21935371a3f88cd085983730a8e9a10de9f95ba9f"} Feb 25 16:02:15 crc kubenswrapper[4937]: I0225 16:02:15.702854 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" event={"ID":"61ab1c4f-9e4e-478e-aaa6-1112641048d0","Type":"ContainerStarted","Data":"00b6aeeb1f697a48df1dc4e51a33b02c3d855699eaecc5c41450e9b1e0b16e54"} Feb 25 16:02:17 crc kubenswrapper[4937]: I0225 16:02:17.719418 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" event={"ID":"61ab1c4f-9e4e-478e-aaa6-1112641048d0","Type":"ContainerStarted","Data":"1f3ae416ba83597e7e3547cdb8ae50c06dfd26a1aa7ec54783b982a17d56360c"} Feb 25 16:02:19 crc kubenswrapper[4937]: I0225 16:02:19.441723 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:02:19 crc kubenswrapper[4937]: I0225 16:02:19.505414 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.737208 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" event={"ID":"61ab1c4f-9e4e-478e-aaa6-1112641048d0","Type":"ContainerStarted","Data":"6b41c4a5b54dae4c3ec31c63a28b55a46654303ac2ae866464f7a3f6fc6c734f"} Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.737757 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.737790 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.769210 4937 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.770941 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" podStartSLOduration=8.770917302 podStartE2EDuration="8.770917302s" podCreationTimestamp="2026-02-25 16:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:02:20.761530767 +0000 UTC m=+991.774922677" watchObservedRunningTime="2026-02-25 16:02:20.770917302 +0000 UTC m=+991.784309192" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.859294 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-p5kbh"] Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.859439 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.859901 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.863862 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b"] Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.863986 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.864385 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.873544 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9"] Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.873673 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.874058 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.877330 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-prw69"] Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.877466 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.877946 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.880556 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9"] Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.880624 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:20 crc kubenswrapper[4937]: I0225 16:02:20.880870 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.916043 4937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-p5kbh_openshift-operators_0eb822b0-826a-4b2d-9376-141a69ba37e5_0(b74dcc0c37736188645aee467eaa6d3b7dfa34e1d5e8ef3cf72f72de2c4ba3d5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.916106 4937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-p5kbh_openshift-operators_0eb822b0-826a-4b2d-9376-141a69ba37e5_0(b74dcc0c37736188645aee467eaa6d3b7dfa34e1d5e8ef3cf72f72de2c4ba3d5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.916125 4937 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-p5kbh_openshift-operators_0eb822b0-826a-4b2d-9376-141a69ba37e5_0(b74dcc0c37736188645aee467eaa6d3b7dfa34e1d5e8ef3cf72f72de2c4ba3d5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.916165 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-p5kbh_openshift-operators(0eb822b0-826a-4b2d-9376-141a69ba37e5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-p5kbh_openshift-operators(0eb822b0-826a-4b2d-9376-141a69ba37e5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-p5kbh_openshift-operators_0eb822b0-826a-4b2d-9376-141a69ba37e5_0(b74dcc0c37736188645aee467eaa6d3b7dfa34e1d5e8ef3cf72f72de2c4ba3d5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" podUID="0eb822b0-826a-4b2d-9376-141a69ba37e5" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.944521 4937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_openshift-operators_1cad8e2a-5182-4d59-9afa-c64ced98e87b_0(d31292746006345e02630b7b4ddc0a70dfcbf1cddaa38729d5fb950b31d1b03c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.944575 4937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_openshift-operators_1cad8e2a-5182-4d59-9afa-c64ced98e87b_0(d31292746006345e02630b7b4ddc0a70dfcbf1cddaa38729d5fb950b31d1b03c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.944596 4937 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_openshift-operators_1cad8e2a-5182-4d59-9afa-c64ced98e87b_0(d31292746006345e02630b7b4ddc0a70dfcbf1cddaa38729d5fb950b31d1b03c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.944641 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_openshift-operators(1cad8e2a-5182-4d59-9afa-c64ced98e87b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_openshift-operators(1cad8e2a-5182-4d59-9afa-c64ced98e87b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_openshift-operators_1cad8e2a-5182-4d59-9afa-c64ced98e87b_0(d31292746006345e02630b7b4ddc0a70dfcbf1cddaa38729d5fb950b31d1b03c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" podUID="1cad8e2a-5182-4d59-9afa-c64ced98e87b" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.954057 4937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-46sv9_openshift-operators_4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af_0(bce5802f2aee4d1cbe56d23fa07f796459049bf908d8a6822e3e53b13af9fcd8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.954131 4937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-46sv9_openshift-operators_4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af_0(bce5802f2aee4d1cbe56d23fa07f796459049bf908d8a6822e3e53b13af9fcd8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.954159 4937 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-46sv9_openshift-operators_4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af_0(bce5802f2aee4d1cbe56d23fa07f796459049bf908d8a6822e3e53b13af9fcd8): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.954212 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-46sv9_openshift-operators(4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-46sv9_openshift-operators(4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-46sv9_openshift-operators_4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af_0(bce5802f2aee4d1cbe56d23fa07f796459049bf908d8a6822e3e53b13af9fcd8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" podUID="4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.979668 4937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-prw69_openshift-operators_26437cd5-3ce5-4d7a-9b7f-9f983015f74d_0(0a91fdbe4f8c7d8306fbca45567723838f7f0b46641796c92db1f2f3fe9aaf3f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.979729 4937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-prw69_openshift-operators_26437cd5-3ce5-4d7a-9b7f-9f983015f74d_0(0a91fdbe4f8c7d8306fbca45567723838f7f0b46641796c92db1f2f3fe9aaf3f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.979750 4937 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-prw69_openshift-operators_26437cd5-3ce5-4d7a-9b7f-9f983015f74d_0(0a91fdbe4f8c7d8306fbca45567723838f7f0b46641796c92db1f2f3fe9aaf3f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.979786 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-prw69_openshift-operators(26437cd5-3ce5-4d7a-9b7f-9f983015f74d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-prw69_openshift-operators(26437cd5-3ce5-4d7a-9b7f-9f983015f74d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-prw69_openshift-operators_26437cd5-3ce5-4d7a-9b7f-9f983015f74d_0(0a91fdbe4f8c7d8306fbca45567723838f7f0b46641796c92db1f2f3fe9aaf3f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-prw69" podUID="26437cd5-3ce5-4d7a-9b7f-9f983015f74d" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.998090 4937 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_openshift-operators_8f8315c1-97ca-4525-a1a8-afe98581f614_0(6390ae2708b88ca4714e48a37804169b68fec2e32a5965826a4dc9a29df2e8c4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.998152 4937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_openshift-operators_8f8315c1-97ca-4525-a1a8-afe98581f614_0(6390ae2708b88ca4714e48a37804169b68fec2e32a5965826a4dc9a29df2e8c4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.998178 4937 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_openshift-operators_8f8315c1-97ca-4525-a1a8-afe98581f614_0(6390ae2708b88ca4714e48a37804169b68fec2e32a5965826a4dc9a29df2e8c4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:20 crc kubenswrapper[4937]: E0225 16:02:20.998225 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_openshift-operators(8f8315c1-97ca-4525-a1a8-afe98581f614)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_openshift-operators(8f8315c1-97ca-4525-a1a8-afe98581f614)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_openshift-operators_8f8315c1-97ca-4525-a1a8-afe98581f614_0(6390ae2708b88ca4714e48a37804169b68fec2e32a5965826a4dc9a29df2e8c4): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" podUID="8f8315c1-97ca-4525-a1a8-afe98581f614" Feb 25 16:02:21 crc kubenswrapper[4937]: I0225 16:02:21.729138 4937 scope.go:117] "RemoveContainer" containerID="db6e92b472e6c485207bd90cc9b71ceb4b2506038df4c6f7fd5b19e7900b60ce" Feb 25 16:02:21 crc kubenswrapper[4937]: I0225 16:02:21.744381 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:21 crc kubenswrapper[4937]: I0225 16:02:21.773201 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:21 crc kubenswrapper[4937]: I0225 16:02:21.855787 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q6pxs"] Feb 25 16:02:21 crc kubenswrapper[4937]: I0225 16:02:21.856019 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q6pxs" podUID="17373bf5-4311-4847-bfbf-3b346a214d8c" containerName="registry-server" containerID="cri-o://094141288aa922e23f72a6fe3235b6503b2e76ee3aa69151b539fda9feb5ee09" gracePeriod=2 Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.056117 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.201792 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17373bf5-4311-4847-bfbf-3b346a214d8c-catalog-content\") pod \"17373bf5-4311-4847-bfbf-3b346a214d8c\" (UID: \"17373bf5-4311-4847-bfbf-3b346a214d8c\") " Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.201966 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17373bf5-4311-4847-bfbf-3b346a214d8c-utilities\") pod \"17373bf5-4311-4847-bfbf-3b346a214d8c\" (UID: \"17373bf5-4311-4847-bfbf-3b346a214d8c\") " Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.202001 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bbst\" (UniqueName: \"kubernetes.io/projected/17373bf5-4311-4847-bfbf-3b346a214d8c-kube-api-access-9bbst\") pod \"17373bf5-4311-4847-bfbf-3b346a214d8c\" (UID: \"17373bf5-4311-4847-bfbf-3b346a214d8c\") " Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.202844 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17373bf5-4311-4847-bfbf-3b346a214d8c-utilities" (OuterVolumeSpecName: "utilities") pod "17373bf5-4311-4847-bfbf-3b346a214d8c" (UID: "17373bf5-4311-4847-bfbf-3b346a214d8c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.207674 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17373bf5-4311-4847-bfbf-3b346a214d8c-kube-api-access-9bbst" (OuterVolumeSpecName: "kube-api-access-9bbst") pod "17373bf5-4311-4847-bfbf-3b346a214d8c" (UID: "17373bf5-4311-4847-bfbf-3b346a214d8c"). InnerVolumeSpecName "kube-api-access-9bbst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.305093 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17373bf5-4311-4847-bfbf-3b346a214d8c-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.305358 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bbst\" (UniqueName: \"kubernetes.io/projected/17373bf5-4311-4847-bfbf-3b346a214d8c-kube-api-access-9bbst\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.396802 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17373bf5-4311-4847-bfbf-3b346a214d8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17373bf5-4311-4847-bfbf-3b346a214d8c" (UID: "17373bf5-4311-4847-bfbf-3b346a214d8c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.406389 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17373bf5-4311-4847-bfbf-3b346a214d8c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.750573 4937 generic.go:334] "Generic (PLEG): container finished" podID="17373bf5-4311-4847-bfbf-3b346a214d8c" containerID="094141288aa922e23f72a6fe3235b6503b2e76ee3aa69151b539fda9feb5ee09" exitCode=0 Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.750660 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q6pxs" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.750761 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6pxs" event={"ID":"17373bf5-4311-4847-bfbf-3b346a214d8c","Type":"ContainerDied","Data":"094141288aa922e23f72a6fe3235b6503b2e76ee3aa69151b539fda9feb5ee09"} Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.750794 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6pxs" event={"ID":"17373bf5-4311-4847-bfbf-3b346a214d8c","Type":"ContainerDied","Data":"504a6beed0bcda7ee319432acfa600b9f29a65857a186c2b14ace1390bd2da32"} Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.750816 4937 scope.go:117] "RemoveContainer" containerID="094141288aa922e23f72a6fe3235b6503b2e76ee3aa69151b539fda9feb5ee09" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.767281 4937 scope.go:117] "RemoveContainer" containerID="8ff9e6538a27044eaa3a44f33f87566cffc7cae195ab08b6b6f0eb611c987241" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.788593 4937 scope.go:117] "RemoveContainer" containerID="2ba8f41d8fda95489d38ef00cafd16ede50d5c099ce5bf393f4739f4478d0832" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.791295 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q6pxs"] Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.795883 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q6pxs"] Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.802623 4937 scope.go:117] "RemoveContainer" containerID="094141288aa922e23f72a6fe3235b6503b2e76ee3aa69151b539fda9feb5ee09" Feb 25 16:02:22 crc kubenswrapper[4937]: E0225 16:02:22.803038 4937 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"094141288aa922e23f72a6fe3235b6503b2e76ee3aa69151b539fda9feb5ee09\": container with ID starting with 094141288aa922e23f72a6fe3235b6503b2e76ee3aa69151b539fda9feb5ee09 not found: ID does not exist" containerID="094141288aa922e23f72a6fe3235b6503b2e76ee3aa69151b539fda9feb5ee09" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.803069 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"094141288aa922e23f72a6fe3235b6503b2e76ee3aa69151b539fda9feb5ee09"} err="failed to get container status \"094141288aa922e23f72a6fe3235b6503b2e76ee3aa69151b539fda9feb5ee09\": rpc error: code = NotFound desc = could not find container \"094141288aa922e23f72a6fe3235b6503b2e76ee3aa69151b539fda9feb5ee09\": container with ID starting with 094141288aa922e23f72a6fe3235b6503b2e76ee3aa69151b539fda9feb5ee09 not found: ID does not exist" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.803089 4937 scope.go:117] "RemoveContainer" containerID="8ff9e6538a27044eaa3a44f33f87566cffc7cae195ab08b6b6f0eb611c987241" Feb 25 16:02:22 crc kubenswrapper[4937]: E0225 16:02:22.803407 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff9e6538a27044eaa3a44f33f87566cffc7cae195ab08b6b6f0eb611c987241\": container with ID starting with 8ff9e6538a27044eaa3a44f33f87566cffc7cae195ab08b6b6f0eb611c987241 not found: ID does not exist" containerID="8ff9e6538a27044eaa3a44f33f87566cffc7cae195ab08b6b6f0eb611c987241" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.803545 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff9e6538a27044eaa3a44f33f87566cffc7cae195ab08b6b6f0eb611c987241"} err="failed to get container status \"8ff9e6538a27044eaa3a44f33f87566cffc7cae195ab08b6b6f0eb611c987241\": rpc error: code = NotFound desc = could not find container \"8ff9e6538a27044eaa3a44f33f87566cffc7cae195ab08b6b6f0eb611c987241\": container with ID starting with 8ff9e6538a27044eaa3a44f33f87566cffc7cae195ab08b6b6f0eb611c987241 not found: ID does not exist" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.803655 4937 scope.go:117] "RemoveContainer" containerID="2ba8f41d8fda95489d38ef00cafd16ede50d5c099ce5bf393f4739f4478d0832" Feb 25 16:02:22 crc kubenswrapper[4937]: E0225 16:02:22.803999 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba8f41d8fda95489d38ef00cafd16ede50d5c099ce5bf393f4739f4478d0832\": container with ID starting with 2ba8f41d8fda95489d38ef00cafd16ede50d5c099ce5bf393f4739f4478d0832 not found: ID does not exist" containerID="2ba8f41d8fda95489d38ef00cafd16ede50d5c099ce5bf393f4739f4478d0832" Feb 25 16:02:22 crc kubenswrapper[4937]: I0225 16:02:22.804022 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba8f41d8fda95489d38ef00cafd16ede50d5c099ce5bf393f4739f4478d0832"} err="failed to get container status \"2ba8f41d8fda95489d38ef00cafd16ede50d5c099ce5bf393f4739f4478d0832\": rpc error: code = NotFound desc = could not find container \"2ba8f41d8fda95489d38ef00cafd16ede50d5c099ce5bf393f4739f4478d0832\": container with ID starting with 2ba8f41d8fda95489d38ef00cafd16ede50d5c099ce5bf393f4739f4478d0832 not found: ID does not exist" Feb 25 16:02:23 crc kubenswrapper[4937]: I0225 16:02:23.376506 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="17373bf5-4311-4847-bfbf-3b346a214d8c" path="/var/lib/kubelet/pods/17373bf5-4311-4847-bfbf-3b346a214d8c/volumes" Feb 25 16:02:33 crc kubenswrapper[4937]: I0225 16:02:33.367402 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:33 crc kubenswrapper[4937]: I0225 16:02:33.368504 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" Feb 25 16:02:33 crc kubenswrapper[4937]: I0225 16:02:33.654072 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b"] Feb 25 16:02:33 crc kubenswrapper[4937]: W0225 16:02:33.658357 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cad8e2a_5182_4d59_9afa_c64ced98e87b.slice/crio-287c59c00f7fd889de91e6b3c479701f988a167ca3222d312dfa22fc296738ff WatchSource:0}: Error finding container 287c59c00f7fd889de91e6b3c479701f988a167ca3222d312dfa22fc296738ff: Status 404 returned error can't find the container with id 287c59c00f7fd889de91e6b3c479701f988a167ca3222d312dfa22fc296738ff Feb 25 16:02:33 crc kubenswrapper[4937]: I0225 16:02:33.832479 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" event={"ID":"1cad8e2a-5182-4d59-9afa-c64ced98e87b","Type":"ContainerStarted","Data":"287c59c00f7fd889de91e6b3c479701f988a167ca3222d312dfa22fc296738ff"} Feb 25 16:02:34 crc kubenswrapper[4937]: I0225 16:02:34.366599 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:34 crc kubenswrapper[4937]: I0225 16:02:34.366661 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" Feb 25 16:02:34 crc kubenswrapper[4937]: I0225 16:02:34.367509 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:34 crc kubenswrapper[4937]: I0225 16:02:34.367719 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" Feb 25 16:02:34 crc kubenswrapper[4937]: I0225 16:02:34.698289 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-p5kbh"] Feb 25 16:02:34 crc kubenswrapper[4937]: I0225 16:02:34.838708 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" event={"ID":"0eb822b0-826a-4b2d-9376-141a69ba37e5","Type":"ContainerStarted","Data":"166bfefb95c9b0bb4b0f95eeae1d8b3cf83fc1e99a36ed0394650d8f0ec8f3e6"} Feb 25 16:02:34 crc kubenswrapper[4937]: I0225 16:02:34.941950 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9"] Feb 25 16:02:34 crc kubenswrapper[4937]: W0225 16:02:34.948258 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e382460_7f5c_4a5d_ae1f_f1cbbdbfa6af.slice/crio-9231904eeb4b8d4855d2cd2498ab67b04d5ba27c7d5f48ec51c959c4008a13f4 WatchSource:0}: Error finding container 9231904eeb4b8d4855d2cd2498ab67b04d5ba27c7d5f48ec51c959c4008a13f4: Status 404 returned error can't find the container with id 9231904eeb4b8d4855d2cd2498ab67b04d5ba27c7d5f48ec51c959c4008a13f4 Feb 25 16:02:35 crc kubenswrapper[4937]: I0225 16:02:35.843293 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" event={"ID":"4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af","Type":"ContainerStarted","Data":"9231904eeb4b8d4855d2cd2498ab67b04d5ba27c7d5f48ec51c959c4008a13f4"} Feb 25 16:02:36 crc kubenswrapper[4937]: I0225 16:02:36.366674 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:36 crc kubenswrapper[4937]: I0225 16:02:36.366750 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:36 crc kubenswrapper[4937]: I0225 16:02:36.367112 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:02:36 crc kubenswrapper[4937]: I0225 16:02:36.367171 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" Feb 25 16:02:36 crc kubenswrapper[4937]: I0225 16:02:36.798137 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9"] Feb 25 16:02:36 crc kubenswrapper[4937]: W0225 16:02:36.806261 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f8315c1_97ca_4525_a1a8_afe98581f614.slice/crio-f4284e3147e138cf58912d4db5bb7326f59dc5a0f75ab94b5b38fb8a1f8002b1 WatchSource:0}: Error finding container f4284e3147e138cf58912d4db5bb7326f59dc5a0f75ab94b5b38fb8a1f8002b1: Status 404 returned error can't find the container with id f4284e3147e138cf58912d4db5bb7326f59dc5a0f75ab94b5b38fb8a1f8002b1 Feb 25 16:02:36 crc kubenswrapper[4937]: I0225 16:02:36.854794 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" event={"ID":"8f8315c1-97ca-4525-a1a8-afe98581f614","Type":"ContainerStarted","Data":"f4284e3147e138cf58912d4db5bb7326f59dc5a0f75ab94b5b38fb8a1f8002b1"} Feb 25 16:02:36 crc kubenswrapper[4937]: I0225 16:02:36.855232 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-prw69"] Feb 25 16:02:36 crc kubenswrapper[4937]: W0225 16:02:36.869684 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26437cd5_3ce5_4d7a_9b7f_9f983015f74d.slice/crio-2c8639e6aeb20de15d07e2a70e9e8299fe3565bccc4b57d467d72db44d232169 WatchSource:0}: Error finding container 2c8639e6aeb20de15d07e2a70e9e8299fe3565bccc4b57d467d72db44d232169: Status 404 returned error can't find the container with id 2c8639e6aeb20de15d07e2a70e9e8299fe3565bccc4b57d467d72db44d232169 Feb 25 16:02:37 crc kubenswrapper[4937]: I0225 16:02:37.860685 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-prw69" event={"ID":"26437cd5-3ce5-4d7a-9b7f-9f983015f74d","Type":"ContainerStarted","Data":"2c8639e6aeb20de15d07e2a70e9e8299fe3565bccc4b57d467d72db44d232169"} Feb 25 16:02:42 crc kubenswrapper[4937]: I0225 16:02:42.864792 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t9fdh" Feb 25 16:02:54 crc kubenswrapper[4937]: E0225 16:02:54.222770 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8" Feb 25 16:02:54 crc kubenswrapper[4937]: E0225 16:02:54.223443 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hdf9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5bf474d74f-prw69_openshift-operators(26437cd5-3ce5-4d7a-9b7f-9f983015f74d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 25 16:02:54 crc kubenswrapper[4937]: E0225 16:02:54.224686 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5bf474d74f-prw69" podUID="26437cd5-3ce5-4d7a-9b7f-9f983015f74d" Feb 25 16:02:54 crc kubenswrapper[4937]: I0225 16:02:54.960661 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" event={"ID":"4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af","Type":"ContainerStarted","Data":"c3483f30c7ed6ff38fcdd523c77ae34fcf63641f792738ce516bf6dfd3ad2ab1"} Feb 25 16:02:54 crc kubenswrapper[4937]: I0225 16:02:54.962530 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" event={"ID":"8f8315c1-97ca-4525-a1a8-afe98581f614","Type":"ContainerStarted","Data":"3b785566b271da0f956dac98da8c05587cb6fec2e68923254c4098bdcb5e8aa4"} Feb 25 16:02:54 crc kubenswrapper[4937]: I0225 16:02:54.964818 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" event={"ID":"1cad8e2a-5182-4d59-9afa-c64ced98e87b","Type":"ContainerStarted","Data":"b6606b157bdaa28b5ad1f6a42c87c1051a0ab4bc787ab9cc5602e56597acd451"} Feb 25 16:02:54 crc kubenswrapper[4937]: I0225 16:02:54.966779 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" 
event={"ID":"0eb822b0-826a-4b2d-9376-141a69ba37e5","Type":"ContainerStarted","Data":"798996ef6f5fd27f1d48e30fcc74206c1af3a291e2e42e2b500859fd87dcf617"} Feb 25 16:02:54 crc kubenswrapper[4937]: E0225 16:02:54.969996 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8\\\"\"" pod="openshift-operators/perses-operator-5bf474d74f-prw69" podUID="26437cd5-3ce5-4d7a-9b7f-9f983015f74d" Feb 25 16:02:54 crc kubenswrapper[4937]: I0225 16:02:54.992368 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-46sv9" podStartSLOduration=22.728901204 podStartE2EDuration="41.992346044s" podCreationTimestamp="2026-02-25 16:02:13 +0000 UTC" firstStartedPulling="2026-02-25 16:02:34.950566931 +0000 UTC m=+1005.963958821" lastFinishedPulling="2026-02-25 16:02:54.214011771 +0000 UTC m=+1025.227403661" observedRunningTime="2026-02-25 16:02:54.98701508 +0000 UTC m=+1026.000407010" watchObservedRunningTime="2026-02-25 16:02:54.992346044 +0000 UTC m=+1026.005737944" Feb 25 16:02:55 crc kubenswrapper[4937]: I0225 16:02:55.048648 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" podStartSLOduration=21.54826876 podStartE2EDuration="41.048612675s" podCreationTimestamp="2026-02-25 16:02:14 +0000 UTC" firstStartedPulling="2026-02-25 16:02:34.712437425 +0000 UTC m=+1005.725829315" lastFinishedPulling="2026-02-25 16:02:54.21278134 +0000 UTC m=+1025.226173230" observedRunningTime="2026-02-25 16:02:55.024075839 +0000 UTC m=+1026.037467769" watchObservedRunningTime="2026-02-25 16:02:55.048612675 +0000 UTC m=+1026.062004605" Feb 25 16:02:55 crc kubenswrapper[4937]: I0225 16:02:55.049386 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9" podStartSLOduration=24.644856856 podStartE2EDuration="42.049371724s" podCreationTimestamp="2026-02-25 16:02:13 +0000 UTC" firstStartedPulling="2026-02-25 16:02:36.809042691 +0000 UTC m=+1007.822434581" lastFinishedPulling="2026-02-25 16:02:54.213557559 +0000 UTC m=+1025.226949449" observedRunningTime="2026-02-25 16:02:55.043112027 +0000 UTC m=+1026.056503957" watchObservedRunningTime="2026-02-25 16:02:55.049371724 +0000 UTC m=+1026.062763664" Feb 25 16:02:55 crc kubenswrapper[4937]: I0225 16:02:55.090616 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b" podStartSLOduration=21.477031996 podStartE2EDuration="42.090594418s" podCreationTimestamp="2026-02-25 16:02:13 +0000 UTC" firstStartedPulling="2026-02-25 16:02:33.663190373 +0000 UTC m=+1004.676582273" lastFinishedPulling="2026-02-25 16:02:54.276752805 +0000 UTC m=+1025.290144695" observedRunningTime="2026-02-25 16:02:55.089182913 +0000 UTC m=+1026.102574823" watchObservedRunningTime="2026-02-25 16:02:55.090594418 +0000 UTC m=+1026.103986308" Feb 25 16:02:55 crc kubenswrapper[4937]: I0225 16:02:55.973547 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:02:55 crc kubenswrapper[4937]: I0225 16:02:55.977709 4937 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-p5kbh" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.697964 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-p4vff"] Feb 25 16:03:05 crc kubenswrapper[4937]: E0225 16:03:05.698749 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17373bf5-4311-4847-bfbf-3b346a214d8c" containerName="registry-server" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.698766 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="17373bf5-4311-4847-bfbf-3b346a214d8c" containerName="registry-server" Feb 25 16:03:05 crc kubenswrapper[4937]: E0225 16:03:05.698785 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17373bf5-4311-4847-bfbf-3b346a214d8c" containerName="extract-content" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.698791 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="17373bf5-4311-4847-bfbf-3b346a214d8c" containerName="extract-content" Feb 25 16:03:05 crc kubenswrapper[4937]: E0225 16:03:05.698806 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17373bf5-4311-4847-bfbf-3b346a214d8c" containerName="extract-utilities" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.698813 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="17373bf5-4311-4847-bfbf-3b346a214d8c" containerName="extract-utilities" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.698902 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="17373bf5-4311-4847-bfbf-3b346a214d8c" containerName="registry-server" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.699320 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-p4vff" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.701355 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.701680 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.701836 4937 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-dc4hg" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.704531 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-tfn2h"] Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.705337 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-tfn2h" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.709837 4937 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-flr9t" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.731711 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-p4vff"] Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.735732 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-tfn2h"] Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.740272 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-m4kc4"] Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.741224 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-m4kc4" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.742067 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsrrt\" (UniqueName: \"kubernetes.io/projected/2e84eec9-8ff5-4f02-9596-e468e289dba0-kube-api-access-wsrrt\") pod \"cert-manager-webhook-687f57d79b-m4kc4\" (UID: \"2e84eec9-8ff5-4f02-9596-e468e289dba0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-m4kc4" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.742226 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmn2n\" (UniqueName: \"kubernetes.io/projected/92b2442a-04d9-4377-bef2-958d8a72543f-kube-api-access-jmn2n\") pod \"cert-manager-858654f9db-tfn2h\" (UID: \"92b2442a-04d9-4377-bef2-958d8a72543f\") " pod="cert-manager/cert-manager-858654f9db-tfn2h" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.742262 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qc8\" (UniqueName: \"kubernetes.io/projected/99c5f86d-7755-49b0-bb68-7e9a338dbca7-kube-api-access-46qc8\") pod \"cert-manager-cainjector-cf98fcc89-p4vff\" (UID: \"99c5f86d-7755-49b0-bb68-7e9a338dbca7\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-p4vff" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.743258 4937 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-xmdxd" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.748927 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-m4kc4"] Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.843788 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsrrt\" (UniqueName: \"kubernetes.io/projected/2e84eec9-8ff5-4f02-9596-e468e289dba0-kube-api-access-wsrrt\") pod \"cert-manager-webhook-687f57d79b-m4kc4\" (UID: \"2e84eec9-8ff5-4f02-9596-e468e289dba0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-m4kc4" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.843866 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmn2n\" (UniqueName: \"kubernetes.io/projected/92b2442a-04d9-4377-bef2-958d8a72543f-kube-api-access-jmn2n\") pod \"cert-manager-858654f9db-tfn2h\" (UID: \"92b2442a-04d9-4377-bef2-958d8a72543f\") " pod="cert-manager/cert-manager-858654f9db-tfn2h" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.843889 4937 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46qc8\" (UniqueName: \"kubernetes.io/projected/99c5f86d-7755-49b0-bb68-7e9a338dbca7-kube-api-access-46qc8\") pod \"cert-manager-cainjector-cf98fcc89-p4vff\" (UID: \"99c5f86d-7755-49b0-bb68-7e9a338dbca7\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-p4vff" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.860793 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsrrt\" (UniqueName: \"kubernetes.io/projected/2e84eec9-8ff5-4f02-9596-e468e289dba0-kube-api-access-wsrrt\") pod \"cert-manager-webhook-687f57d79b-m4kc4\" (UID: \"2e84eec9-8ff5-4f02-9596-e468e289dba0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-m4kc4" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.860959 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qc8\" (UniqueName: \"kubernetes.io/projected/99c5f86d-7755-49b0-bb68-7e9a338dbca7-kube-api-access-46qc8\") pod \"cert-manager-cainjector-cf98fcc89-p4vff\" (UID: \"99c5f86d-7755-49b0-bb68-7e9a338dbca7\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-p4vff" Feb 25 16:03:05 crc kubenswrapper[4937]: I0225 16:03:05.861552 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmn2n\" (UniqueName: \"kubernetes.io/projected/92b2442a-04d9-4377-bef2-958d8a72543f-kube-api-access-jmn2n\") pod \"cert-manager-858654f9db-tfn2h\" (UID: \"92b2442a-04d9-4377-bef2-958d8a72543f\") " pod="cert-manager/cert-manager-858654f9db-tfn2h" Feb 25 16:03:06 crc kubenswrapper[4937]: I0225 16:03:06.022091 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-p4vff" Feb 25 16:03:06 crc kubenswrapper[4937]: I0225 16:03:06.033633 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-tfn2h" Feb 25 16:03:06 crc kubenswrapper[4937]: I0225 16:03:06.054157 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-m4kc4" Feb 25 16:03:06 crc kubenswrapper[4937]: I0225 16:03:06.343741 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-m4kc4"] Feb 25 16:03:06 crc kubenswrapper[4937]: I0225 16:03:06.485588 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-p4vff"] Feb 25 16:03:06 crc kubenswrapper[4937]: I0225 16:03:06.490938 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-tfn2h"] Feb 25 16:03:07 crc kubenswrapper[4937]: I0225 16:03:07.042561 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-m4kc4" event={"ID":"2e84eec9-8ff5-4f02-9596-e468e289dba0","Type":"ContainerStarted","Data":"27f703db4dffd9a84ed6d2fe2973b61dbac522158b52464b90fb0c271eaa6dca"} Feb 25 16:03:07 crc kubenswrapper[4937]: I0225 16:03:07.043505 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-p4vff" event={"ID":"99c5f86d-7755-49b0-bb68-7e9a338dbca7","Type":"ContainerStarted","Data":"61174dd6641b638aaf7ed368f3ef8ca2022af4fccbf07348a4cd2a25ea90967d"} Feb 25 16:03:07 crc kubenswrapper[4937]: I0225 16:03:07.044988 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-tfn2h" event={"ID":"92b2442a-04d9-4377-bef2-958d8a72543f","Type":"ContainerStarted","Data":"f15006e98b06e5426126f24169ee2664829027dab25f504f06265280c39d3111"} Feb 25 16:03:08 crc kubenswrapper[4937]: I0225 16:03:08.053911 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-prw69" event={"ID":"26437cd5-3ce5-4d7a-9b7f-9f983015f74d","Type":"ContainerStarted","Data":"5420017e9221d878b8d92391c7c1591a7484f2821110611d790cfae611477767"} Feb 25 16:03:08 crc kubenswrapper[4937]: I0225 16:03:08.054531 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:03:08 crc kubenswrapper[4937]: I0225 16:03:08.077185 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-prw69" podStartSLOduration=23.68005681 podStartE2EDuration="54.077169756s" podCreationTimestamp="2026-02-25 16:02:14 +0000 UTC" firstStartedPulling="2026-02-25 16:02:36.871081098 +0000 UTC m=+1007.884472988" lastFinishedPulling="2026-02-25 16:03:07.268194024 +0000 UTC m=+1038.281585934" observedRunningTime="2026-02-25 16:03:08.073292028 +0000 UTC m=+1039.086683928" watchObservedRunningTime="2026-02-25 16:03:08.077169756 +0000 UTC m=+1039.090561646" Feb 25 16:03:09 crc kubenswrapper[4937]: I0225 16:03:09.084533 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-p4vff" event={"ID":"99c5f86d-7755-49b0-bb68-7e9a338dbca7","Type":"ContainerStarted","Data":"d7352560262a2091b4bef7b11ec6f758ef6a565455f793a99543107ced118391"} Feb 25 16:03:09 crc kubenswrapper[4937]: I0225 16:03:09.109471 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-p4vff" podStartSLOduration=1.7643974199999999 podStartE2EDuration="4.109450052s" podCreationTimestamp="2026-02-25 16:03:05 +0000 UTC" firstStartedPulling="2026-02-25 16:03:06.486938627 +0000 UTC m=+1037.500330517" lastFinishedPulling="2026-02-25 16:03:08.831991259 +0000 UTC m=+1039.845383149" 
observedRunningTime="2026-02-25 16:03:09.107073832 +0000 UTC m=+1040.120465732" watchObservedRunningTime="2026-02-25 16:03:09.109450052 +0000 UTC m=+1040.122841962" Feb 25 16:03:12 crc kubenswrapper[4937]: I0225 16:03:12.106165 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-tfn2h" event={"ID":"92b2442a-04d9-4377-bef2-958d8a72543f","Type":"ContainerStarted","Data":"e4075bca79949c23adea4e1326f6193b3af5e7ac887ef0c3e1e64988a8dfb763"} Feb 25 16:03:12 crc kubenswrapper[4937]: I0225 16:03:12.109260 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-m4kc4" event={"ID":"2e84eec9-8ff5-4f02-9596-e468e289dba0","Type":"ContainerStarted","Data":"3de3e3d2132c09fd537f30b8d6f4b17c7ce74c17bf2b9d991cd85692bd39880a"} Feb 25 16:03:12 crc kubenswrapper[4937]: I0225 16:03:12.109438 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-m4kc4" Feb 25 16:03:12 crc kubenswrapper[4937]: I0225 16:03:12.165928 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-m4kc4" podStartSLOduration=2.517288634 podStartE2EDuration="7.165900535s" podCreationTimestamp="2026-02-25 16:03:05 +0000 UTC" firstStartedPulling="2026-02-25 16:03:06.345397145 +0000 UTC m=+1037.358789035" lastFinishedPulling="2026-02-25 16:03:10.994009046 +0000 UTC m=+1042.007400936" observedRunningTime="2026-02-25 16:03:12.16209377 +0000 UTC m=+1043.175485670" watchObservedRunningTime="2026-02-25 16:03:12.165900535 +0000 UTC m=+1043.179292455" Feb 25 16:03:12 crc kubenswrapper[4937]: I0225 16:03:12.172723 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-tfn2h" podStartSLOduration=2.715380636 podStartE2EDuration="7.172700076s" podCreationTimestamp="2026-02-25 16:03:05 +0000 UTC" firstStartedPulling="2026-02-25 16:03:06.488535667 +0000 UTC m=+1037.501927557" lastFinishedPulling="2026-02-25 16:03:10.945855097 +0000 UTC m=+1041.959246997" observedRunningTime="2026-02-25 16:03:12.129892532 +0000 UTC m=+1043.143284462" watchObservedRunningTime="2026-02-25 16:03:12.172700076 +0000 UTC m=+1043.186091976" Feb 25 16:03:14 crc kubenswrapper[4937]: I0225 16:03:14.529348 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-prw69" Feb 25 16:03:16 crc kubenswrapper[4937]: I0225 16:03:16.057138 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-m4kc4" Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.239588 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv"] Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.241306 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.244103 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.255452 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv"] Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.400537 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47f85b62-41af-4e45-af61-33526ba0d867-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv\" (UID: \"47f85b62-41af-4e45-af61-33526ba0d867\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.400621 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrslm\" (UniqueName: \"kubernetes.io/projected/47f85b62-41af-4e45-af61-33526ba0d867-kube-api-access-rrslm\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv\" (UID: \"47f85b62-41af-4e45-af61-33526ba0d867\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.400707 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47f85b62-41af-4e45-af61-33526ba0d867-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv\" (UID: \"47f85b62-41af-4e45-af61-33526ba0d867\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.502265 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47f85b62-41af-4e45-af61-33526ba0d867-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv\" (UID: \"47f85b62-41af-4e45-af61-33526ba0d867\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.502351 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrslm\" (UniqueName: \"kubernetes.io/projected/47f85b62-41af-4e45-af61-33526ba0d867-kube-api-access-rrslm\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv\" (UID: \"47f85b62-41af-4e45-af61-33526ba0d867\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.502384 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47f85b62-41af-4e45-af61-33526ba0d867-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv\" (UID: \"47f85b62-41af-4e45-af61-33526ba0d867\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.502990 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/47f85b62-41af-4e45-af61-33526ba0d867-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv\" (UID: \"47f85b62-41af-4e45-af61-33526ba0d867\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.503032 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47f85b62-41af-4e45-af61-33526ba0d867-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv\" (UID: \"47f85b62-41af-4e45-af61-33526ba0d867\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.531415 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrslm\" (UniqueName: \"kubernetes.io/projected/47f85b62-41af-4e45-af61-33526ba0d867-kube-api-access-rrslm\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv\" (UID: \"47f85b62-41af-4e45-af61-33526ba0d867\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" Feb 25 16:03:40 crc kubenswrapper[4937]: I0225 16:03:40.559131 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" Feb 25 16:03:41 crc kubenswrapper[4937]: I0225 16:03:41.517448 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:03:41 crc kubenswrapper[4937]: I0225 16:03:41.518171 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:03:41 crc kubenswrapper[4937]: I0225 16:03:41.651625 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv"] Feb 25 16:03:41 crc kubenswrapper[4937]: W0225 16:03:41.656764 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47f85b62_41af_4e45_af61_33526ba0d867.slice/crio-5c616dc0442d588407e62c66c3ec681e68f725b1a22c8afb4b22305c01323fe6 WatchSource:0}: Error finding container 5c616dc0442d588407e62c66c3ec681e68f725b1a22c8afb4b22305c01323fe6: Status 404 returned error can't find the container with id 5c616dc0442d588407e62c66c3ec681e68f725b1a22c8afb4b22305c01323fe6 Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.489466 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.490464 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.492343 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.492457 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.492466 4937 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-dp7vx" Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.502169 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.654761 4937 generic.go:334] "Generic (PLEG): container finished" podID="47f85b62-41af-4e45-af61-33526ba0d867" containerID="8d5f12b50ba2f80f5c75c82ad40f0f1eee8aff4fffff144855f3c47642fc1c2f" exitCode=0 Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.654820 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" event={"ID":"47f85b62-41af-4e45-af61-33526ba0d867","Type":"ContainerDied","Data":"8d5f12b50ba2f80f5c75c82ad40f0f1eee8aff4fffff144855f3c47642fc1c2f"} Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.654850 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" event={"ID":"47f85b62-41af-4e45-af61-33526ba0d867","Type":"ContainerStarted","Data":"5c616dc0442d588407e62c66c3ec681e68f725b1a22c8afb4b22305c01323fe6"} Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.745077 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lppdf\" (UniqueName: \"kubernetes.io/projected/5819358a-4e92-4926-920e-fec550072693-kube-api-access-lppdf\") pod \"minio\" (UID: \"5819358a-4e92-4926-920e-fec550072693\") " pod="minio-dev/minio" Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.745120 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9f750da2-9841-478e-867a-2b9ba7317655\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9f750da2-9841-478e-867a-2b9ba7317655\") pod \"minio\" (UID: \"5819358a-4e92-4926-920e-fec550072693\") " pod="minio-dev/minio" Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.846598 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lppdf\" (UniqueName: \"kubernetes.io/projected/5819358a-4e92-4926-920e-fec550072693-kube-api-access-lppdf\") pod \"minio\" (UID: \"5819358a-4e92-4926-920e-fec550072693\") " pod="minio-dev/minio" Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.846643 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9f750da2-9841-478e-867a-2b9ba7317655\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9f750da2-9841-478e-867a-2b9ba7317655\") pod \"minio\" (UID: \"5819358a-4e92-4926-920e-fec550072693\") " pod="minio-dev/minio" Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.850546 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.850599 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9f750da2-9841-478e-867a-2b9ba7317655\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9f750da2-9841-478e-867a-2b9ba7317655\") pod \"minio\" (UID: \"5819358a-4e92-4926-920e-fec550072693\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a21cc0d7cb001424dc6e4e9a015117b897972fbcaf23fccaf081d4ec035efc75/globalmount\"" pod="minio-dev/minio" Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.876635 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lppdf\" (UniqueName: \"kubernetes.io/projected/5819358a-4e92-4926-920e-fec550072693-kube-api-access-lppdf\") pod \"minio\" (UID: \"5819358a-4e92-4926-920e-fec550072693\") " pod="minio-dev/minio" Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.878851 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9f750da2-9841-478e-867a-2b9ba7317655\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9f750da2-9841-478e-867a-2b9ba7317655\") pod \"minio\" (UID: \"5819358a-4e92-4926-920e-fec550072693\") " pod="minio-dev/minio" Feb 25 16:03:42 crc kubenswrapper[4937]: I0225 16:03:42.953538 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 25 16:03:43 crc kubenswrapper[4937]: I0225 16:03:43.390364 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 25 16:03:43 crc kubenswrapper[4937]: W0225 16:03:43.397638 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5819358a_4e92_4926_920e_fec550072693.slice/crio-336de5502cc04ea40b8ae7779c4b19f77fadc35d03db0a7ec3c949202da7d81f WatchSource:0}: Error finding container 336de5502cc04ea40b8ae7779c4b19f77fadc35d03db0a7ec3c949202da7d81f: Status 404 returned error can't find the container with id 336de5502cc04ea40b8ae7779c4b19f77fadc35d03db0a7ec3c949202da7d81f Feb 25 16:03:43 crc kubenswrapper[4937]: I0225 16:03:43.661468 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"5819358a-4e92-4926-920e-fec550072693","Type":"ContainerStarted","Data":"336de5502cc04ea40b8ae7779c4b19f77fadc35d03db0a7ec3c949202da7d81f"} Feb 25 16:03:46 crc kubenswrapper[4937]: I0225 16:03:46.687352 4937 generic.go:334] "Generic (PLEG): container finished" podID="47f85b62-41af-4e45-af61-33526ba0d867" containerID="305d7fe269c5c0864e3cd9f73d7e24045b77122e379ba0e3f0cff28e69791cdc" exitCode=0 Feb 25 16:03:46 crc kubenswrapper[4937]: I0225 16:03:46.687504 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" event={"ID":"47f85b62-41af-4e45-af61-33526ba0d867","Type":"ContainerDied","Data":"305d7fe269c5c0864e3cd9f73d7e24045b77122e379ba0e3f0cff28e69791cdc"} Feb 25 16:03:47 crc kubenswrapper[4937]: I0225 16:03:47.698786 4937 generic.go:334] "Generic (PLEG): container finished" podID="47f85b62-41af-4e45-af61-33526ba0d867" containerID="59475116287f7517eca173abf1f55e3a44ba532f9faf1739d6abcd4aaabbeacf" exitCode=0 Feb 25 16:03:47 crc kubenswrapper[4937]: I0225 16:03:47.698865 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" 
event={"ID":"47f85b62-41af-4e45-af61-33526ba0d867","Type":"ContainerDied","Data":"59475116287f7517eca173abf1f55e3a44ba532f9faf1739d6abcd4aaabbeacf"} Feb 25 16:03:49 crc kubenswrapper[4937]: I0225 16:03:49.101055 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" Feb 25 16:03:49 crc kubenswrapper[4937]: I0225 16:03:49.232602 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47f85b62-41af-4e45-af61-33526ba0d867-bundle\") pod \"47f85b62-41af-4e45-af61-33526ba0d867\" (UID: \"47f85b62-41af-4e45-af61-33526ba0d867\") " Feb 25 16:03:49 crc kubenswrapper[4937]: I0225 16:03:49.232696 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47f85b62-41af-4e45-af61-33526ba0d867-util\") pod \"47f85b62-41af-4e45-af61-33526ba0d867\" (UID: \"47f85b62-41af-4e45-af61-33526ba0d867\") " Feb 25 16:03:49 crc kubenswrapper[4937]: I0225 16:03:49.232797 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrslm\" (UniqueName: \"kubernetes.io/projected/47f85b62-41af-4e45-af61-33526ba0d867-kube-api-access-rrslm\") pod \"47f85b62-41af-4e45-af61-33526ba0d867\" (UID: \"47f85b62-41af-4e45-af61-33526ba0d867\") " Feb 25 16:03:49 crc kubenswrapper[4937]: I0225 16:03:49.234561 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47f85b62-41af-4e45-af61-33526ba0d867-bundle" (OuterVolumeSpecName: "bundle") pod "47f85b62-41af-4e45-af61-33526ba0d867" (UID: "47f85b62-41af-4e45-af61-33526ba0d867"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:03:49 crc kubenswrapper[4937]: I0225 16:03:49.244857 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47f85b62-41af-4e45-af61-33526ba0d867-util" (OuterVolumeSpecName: "util") pod "47f85b62-41af-4e45-af61-33526ba0d867" (UID: "47f85b62-41af-4e45-af61-33526ba0d867"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:03:49 crc kubenswrapper[4937]: I0225 16:03:49.250607 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f85b62-41af-4e45-af61-33526ba0d867-kube-api-access-rrslm" (OuterVolumeSpecName: "kube-api-access-rrslm") pod "47f85b62-41af-4e45-af61-33526ba0d867" (UID: "47f85b62-41af-4e45-af61-33526ba0d867"). InnerVolumeSpecName "kube-api-access-rrslm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:03:49 crc kubenswrapper[4937]: I0225 16:03:49.333770 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrslm\" (UniqueName: \"kubernetes.io/projected/47f85b62-41af-4e45-af61-33526ba0d867-kube-api-access-rrslm\") on node \"crc\" DevicePath \"\"" Feb 25 16:03:49 crc kubenswrapper[4937]: I0225 16:03:49.333819 4937 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47f85b62-41af-4e45-af61-33526ba0d867-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:03:49 crc kubenswrapper[4937]: I0225 16:03:49.333830 4937 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47f85b62-41af-4e45-af61-33526ba0d867-util\") on node \"crc\" DevicePath \"\"" Feb 25 16:03:49 crc kubenswrapper[4937]: I0225 16:03:49.713404 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" event={"ID":"47f85b62-41af-4e45-af61-33526ba0d867","Type":"ContainerDied","Data":"5c616dc0442d588407e62c66c3ec681e68f725b1a22c8afb4b22305c01323fe6"} Feb 25 16:03:49 crc kubenswrapper[4937]: I0225 16:03:49.713444 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c616dc0442d588407e62c66c3ec681e68f725b1a22c8afb4b22305c01323fe6" Feb 25 16:03:49 crc kubenswrapper[4937]: I0225 16:03:49.713513 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv" Feb 25 16:03:51 crc kubenswrapper[4937]: I0225 16:03:51.733627 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"5819358a-4e92-4926-920e-fec550072693","Type":"ContainerStarted","Data":"0d8a689540e9df3b7551da82cf8b8b7413f960bac073c88455f171fd07bfdcbf"} Feb 25 16:03:51 crc kubenswrapper[4937]: I0225 16:03:51.767417 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=5.575760551 podStartE2EDuration="12.76737613s" podCreationTimestamp="2026-02-25 16:03:39 +0000 UTC" firstStartedPulling="2026-02-25 16:03:43.399735618 +0000 UTC m=+1074.413127508" lastFinishedPulling="2026-02-25 16:03:50.591351197 +0000 UTC m=+1081.604743087" observedRunningTime="2026-02-25 16:03:51.761177715 +0000 UTC m=+1082.774569605" watchObservedRunningTime="2026-02-25 16:03:51.76737613 +0000 UTC m=+1082.780768040" Feb 25 16:03:54 crc kubenswrapper[4937]: I0225 16:03:54.907588 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6"] Feb 25 16:03:54 crc kubenswrapper[4937]: E0225 16:03:54.908205 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f85b62-41af-4e45-af61-33526ba0d867" containerName="util" Feb 25 16:03:54 crc kubenswrapper[4937]: I0225 16:03:54.908217 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f85b62-41af-4e45-af61-33526ba0d867" containerName="util" Feb 25 16:03:54 crc kubenswrapper[4937]: E0225 16:03:54.908228 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f85b62-41af-4e45-af61-33526ba0d867" containerName="extract" Feb 25 16:03:54 crc kubenswrapper[4937]: I0225 16:03:54.908235 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f85b62-41af-4e45-af61-33526ba0d867" containerName="extract" Feb 25 16:03:54 crc kubenswrapper[4937]: E0225 
16:03:54.908250 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f85b62-41af-4e45-af61-33526ba0d867" containerName="pull" Feb 25 16:03:54 crc kubenswrapper[4937]: I0225 16:03:54.908256 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f85b62-41af-4e45-af61-33526ba0d867" containerName="pull" Feb 25 16:03:54 crc kubenswrapper[4937]: I0225 16:03:54.908345 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f85b62-41af-4e45-af61-33526ba0d867" containerName="extract" Feb 25 16:03:54 crc kubenswrapper[4937]: I0225 16:03:54.908914 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:54 crc kubenswrapper[4937]: I0225 16:03:54.912180 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Feb 25 16:03:54 crc kubenswrapper[4937]: I0225 16:03:54.912232 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Feb 25 16:03:54 crc kubenswrapper[4937]: I0225 16:03:54.912760 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Feb 25 16:03:54 crc kubenswrapper[4937]: I0225 16:03:54.913040 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-ff755" Feb 25 16:03:54 crc kubenswrapper[4937]: I0225 16:03:54.913534 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Feb 25 16:03:54 crc kubenswrapper[4937]: I0225 16:03:54.913670 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.001341 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6"] Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.005589 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b72cc98b-e045-4ade-bdf7-c9929fc489fc-manager-config\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.005653 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b72cc98b-e045-4ade-bdf7-c9929fc489fc-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.005684 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b72cc98b-e045-4ade-bdf7-c9929fc489fc-apiservice-cert\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc 
kubenswrapper[4937]: I0225 16:03:55.005719 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b72cc98b-e045-4ade-bdf7-c9929fc489fc-webhook-cert\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.005894 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p9j8\" (UniqueName: \"kubernetes.io/projected/b72cc98b-e045-4ade-bdf7-c9929fc489fc-kube-api-access-7p9j8\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.106789 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b72cc98b-e045-4ade-bdf7-c9929fc489fc-manager-config\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.106849 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b72cc98b-e045-4ade-bdf7-c9929fc489fc-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.106885 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b72cc98b-e045-4ade-bdf7-c9929fc489fc-apiservice-cert\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.106907 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b72cc98b-e045-4ade-bdf7-c9929fc489fc-webhook-cert\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.106929 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p9j8\" (UniqueName: \"kubernetes.io/projected/b72cc98b-e045-4ade-bdf7-c9929fc489fc-kube-api-access-7p9j8\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.107689 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b72cc98b-e045-4ade-bdf7-c9929fc489fc-manager-config\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.115691 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b72cc98b-e045-4ade-bdf7-c9929fc489fc-webhook-cert\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.118180 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b72cc98b-e045-4ade-bdf7-c9929fc489fc-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.131844 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b72cc98b-e045-4ade-bdf7-c9929fc489fc-apiservice-cert\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.135028 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p9j8\" (UniqueName: \"kubernetes.io/projected/b72cc98b-e045-4ade-bdf7-c9929fc489fc-kube-api-access-7p9j8\") pod \"loki-operator-controller-manager-b5f46f5f7-zscl6\" (UID: \"b72cc98b-e045-4ade-bdf7-c9929fc489fc\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.223790 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.513758 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6"] Feb 25 16:03:55 crc kubenswrapper[4937]: W0225 16:03:55.519924 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb72cc98b_e045_4ade_bdf7_c9929fc489fc.slice/crio-518d9d3ba9394fbe1c7307b1acbf8a319e1bf4958ece8fd8bbf5d7cdc775250d WatchSource:0}: Error finding container 518d9d3ba9394fbe1c7307b1acbf8a319e1bf4958ece8fd8bbf5d7cdc775250d: Status 404 returned error can't find the container with id 518d9d3ba9394fbe1c7307b1acbf8a319e1bf4958ece8fd8bbf5d7cdc775250d Feb 25 16:03:55 crc kubenswrapper[4937]: I0225 16:03:55.758016 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" event={"ID":"b72cc98b-e045-4ade-bdf7-c9929fc489fc","Type":"ContainerStarted","Data":"518d9d3ba9394fbe1c7307b1acbf8a319e1bf4958ece8fd8bbf5d7cdc775250d"} Feb 25 16:04:00 crc kubenswrapper[4937]: I0225 16:04:00.187013 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533924-whqrx"] Feb 25 16:04:00 crc kubenswrapper[4937]: I0225 16:04:00.188429 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533924-whqrx" Feb 25 16:04:00 crc kubenswrapper[4937]: I0225 16:04:00.190805 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:04:00 crc kubenswrapper[4937]: I0225 16:04:00.195307 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:04:00 crc kubenswrapper[4937]: I0225 16:04:00.195702 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:04:00 crc kubenswrapper[4937]: I0225 16:04:00.201046 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533924-whqrx"] Feb 25 16:04:00 crc kubenswrapper[4937]: I0225 16:04:00.289519 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-789wb\" (UniqueName: \"kubernetes.io/projected/f8ea7960-4601-40d9-b43a-69a2799d10c8-kube-api-access-789wb\") pod \"auto-csr-approver-29533924-whqrx\" (UID: \"f8ea7960-4601-40d9-b43a-69a2799d10c8\") " pod="openshift-infra/auto-csr-approver-29533924-whqrx" Feb 25 16:04:00 crc kubenswrapper[4937]: I0225 16:04:00.390564 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-789wb\" (UniqueName: \"kubernetes.io/projected/f8ea7960-4601-40d9-b43a-69a2799d10c8-kube-api-access-789wb\") pod \"auto-csr-approver-29533924-whqrx\" (UID: \"f8ea7960-4601-40d9-b43a-69a2799d10c8\") " pod="openshift-infra/auto-csr-approver-29533924-whqrx" Feb 25 16:04:00 crc kubenswrapper[4937]: I0225 16:04:00.412264 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-789wb\" (UniqueName: \"kubernetes.io/projected/f8ea7960-4601-40d9-b43a-69a2799d10c8-kube-api-access-789wb\") pod \"auto-csr-approver-29533924-whqrx\" (UID: \"f8ea7960-4601-40d9-b43a-69a2799d10c8\") " pod="openshift-infra/auto-csr-approver-29533924-whqrx" Feb 25 16:04:00 crc kubenswrapper[4937]: I0225 16:04:00.517230 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533924-whqrx" Feb 25 16:04:00 crc kubenswrapper[4937]: I0225 16:04:00.729050 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533924-whqrx"] Feb 25 16:04:00 crc kubenswrapper[4937]: I0225 16:04:00.795370 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" event={"ID":"b72cc98b-e045-4ade-bdf7-c9929fc489fc","Type":"ContainerStarted","Data":"55a3210a5aa604daf81309a367973c5976eec7729be1b89a1dc55253468a1637"} Feb 25 16:04:00 crc kubenswrapper[4937]: I0225 16:04:00.796768 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533924-whqrx" event={"ID":"f8ea7960-4601-40d9-b43a-69a2799d10c8","Type":"ContainerStarted","Data":"e7c39cc6b6a1528cc8cdca8c085a532e01e07961d4e825a33fef3f1f055299dd"} Feb 25 16:04:02 crc kubenswrapper[4937]: I0225 16:04:02.813185 4937 generic.go:334] "Generic (PLEG): container finished" podID="f8ea7960-4601-40d9-b43a-69a2799d10c8" containerID="8b1057707b0a9b295b1fd01c4263122dfc4d18d96bdda5e89ca220104284838f" exitCode=0 Feb 25 16:04:02 crc kubenswrapper[4937]: I0225 16:04:02.813297 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533924-whqrx" event={"ID":"f8ea7960-4601-40d9-b43a-69a2799d10c8","Type":"ContainerDied","Data":"8b1057707b0a9b295b1fd01c4263122dfc4d18d96bdda5e89ca220104284838f"} Feb 25 16:04:06 crc kubenswrapper[4937]: I0225 16:04:06.460419 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533924-whqrx" Feb 25 16:04:06 crc kubenswrapper[4937]: I0225 16:04:06.589155 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-789wb\" (UniqueName: \"kubernetes.io/projected/f8ea7960-4601-40d9-b43a-69a2799d10c8-kube-api-access-789wb\") pod \"f8ea7960-4601-40d9-b43a-69a2799d10c8\" (UID: \"f8ea7960-4601-40d9-b43a-69a2799d10c8\") " Feb 25 16:04:06 crc kubenswrapper[4937]: I0225 16:04:06.597043 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8ea7960-4601-40d9-b43a-69a2799d10c8-kube-api-access-789wb" (OuterVolumeSpecName: "kube-api-access-789wb") pod "f8ea7960-4601-40d9-b43a-69a2799d10c8" (UID: "f8ea7960-4601-40d9-b43a-69a2799d10c8"). InnerVolumeSpecName "kube-api-access-789wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:04:06 crc kubenswrapper[4937]: I0225 16:04:06.690503 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-789wb\" (UniqueName: \"kubernetes.io/projected/f8ea7960-4601-40d9-b43a-69a2799d10c8-kube-api-access-789wb\") on node \"crc\" DevicePath \"\"" Feb 25 16:04:06 crc kubenswrapper[4937]: I0225 16:04:06.858105 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533924-whqrx" event={"ID":"f8ea7960-4601-40d9-b43a-69a2799d10c8","Type":"ContainerDied","Data":"e7c39cc6b6a1528cc8cdca8c085a532e01e07961d4e825a33fef3f1f055299dd"} Feb 25 16:04:06 crc kubenswrapper[4937]: I0225 16:04:06.858166 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7c39cc6b6a1528cc8cdca8c085a532e01e07961d4e825a33fef3f1f055299dd" Feb 25 16:04:06 crc kubenswrapper[4937]: I0225 16:04:06.858260 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533924-whqrx" Feb 25 16:04:07 crc kubenswrapper[4937]: I0225 16:04:07.523920 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533918-pv26w"] Feb 25 16:04:07 crc kubenswrapper[4937]: I0225 16:04:07.529611 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533918-pv26w"] Feb 25 16:04:07 crc kubenswrapper[4937]: I0225 16:04:07.867428 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" event={"ID":"b72cc98b-e045-4ade-bdf7-c9929fc489fc","Type":"ContainerStarted","Data":"b124dc7b8f775ee74ced8bf01dc5fcb9189712eb09d597007f02af744fd62838"} Feb 25 16:04:07 crc kubenswrapper[4937]: I0225 16:04:07.868551 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:04:07 crc kubenswrapper[4937]: I0225 16:04:07.872090 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" Feb 25 16:04:07 crc kubenswrapper[4937]: I0225 16:04:07.895279 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-b5f46f5f7-zscl6" podStartSLOduration=1.907623949 podStartE2EDuration="13.895250943s" podCreationTimestamp="2026-02-25 16:03:54 +0000 UTC" firstStartedPulling="2026-02-25 16:03:55.541202526 +0000 UTC m=+1086.554594416" lastFinishedPulling="2026-02-25 16:04:07.52882952 +0000 UTC m=+1098.542221410" observedRunningTime="2026-02-25 16:04:07.889243263 +0000 UTC m=+1098.902635173" watchObservedRunningTime="2026-02-25 16:04:07.895250943 +0000 UTC m=+1098.908642863" Feb 25 16:04:09 crc kubenswrapper[4937]: I0225 16:04:09.375029 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fef7f95-32ec-4dd5-b8a0-0868fef33d2f" path="/var/lib/kubelet/pods/5fef7f95-32ec-4dd5-b8a0-0868fef33d2f/volumes" Feb 25 16:04:11 crc kubenswrapper[4937]: I0225 16:04:11.498313 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:04:11 crc kubenswrapper[4937]: I0225 16:04:11.498388 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:04:12 crc kubenswrapper[4937]: I0225 16:04:12.940584 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rmvqf"] Feb 25 16:04:12 crc kubenswrapper[4937]: E0225 16:04:12.941070 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8ea7960-4601-40d9-b43a-69a2799d10c8" containerName="oc" Feb 25 16:04:12 crc kubenswrapper[4937]: I0225 16:04:12.941082 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8ea7960-4601-40d9-b43a-69a2799d10c8" containerName="oc" Feb 25 16:04:12 crc kubenswrapper[4937]: I0225 16:04:12.941199 4937 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f8ea7960-4601-40d9-b43a-69a2799d10c8" containerName="oc" Feb 25 16:04:12 crc kubenswrapper[4937]: I0225 16:04:12.941961 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:12 crc kubenswrapper[4937]: I0225 16:04:12.959477 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rmvqf"] Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.128302 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhr2s\" (UniqueName: \"kubernetes.io/projected/79c0d99d-3837-4413-b054-42d1d9d7f244-kube-api-access-xhr2s\") pod \"community-operators-rmvqf\" (UID: \"79c0d99d-3837-4413-b054-42d1d9d7f244\") " pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.128346 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c0d99d-3837-4413-b054-42d1d9d7f244-utilities\") pod \"community-operators-rmvqf\" (UID: \"79c0d99d-3837-4413-b054-42d1d9d7f244\") " pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.128376 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c0d99d-3837-4413-b054-42d1d9d7f244-catalog-content\") pod \"community-operators-rmvqf\" (UID: \"79c0d99d-3837-4413-b054-42d1d9d7f244\") " pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.229050 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhr2s\" (UniqueName: \"kubernetes.io/projected/79c0d99d-3837-4413-b054-42d1d9d7f244-kube-api-access-xhr2s\") pod \"community-operators-rmvqf\" (UID: \"79c0d99d-3837-4413-b054-42d1d9d7f244\") " pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.229094 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c0d99d-3837-4413-b054-42d1d9d7f244-utilities\") pod \"community-operators-rmvqf\" (UID: \"79c0d99d-3837-4413-b054-42d1d9d7f244\") " pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.229124 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c0d99d-3837-4413-b054-42d1d9d7f244-catalog-content\") pod \"community-operators-rmvqf\" (UID: \"79c0d99d-3837-4413-b054-42d1d9d7f244\") " pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.229601 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c0d99d-3837-4413-b054-42d1d9d7f244-catalog-content\") pod \"community-operators-rmvqf\" (UID: \"79c0d99d-3837-4413-b054-42d1d9d7f244\") " pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.230140 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c0d99d-3837-4413-b054-42d1d9d7f244-utilities\") pod \"community-operators-rmvqf\" (UID: 
\"79c0d99d-3837-4413-b054-42d1d9d7f244\") " pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.248618 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhr2s\" (UniqueName: \"kubernetes.io/projected/79c0d99d-3837-4413-b054-42d1d9d7f244-kube-api-access-xhr2s\") pod \"community-operators-rmvqf\" (UID: \"79c0d99d-3837-4413-b054-42d1d9d7f244\") " pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.268223 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.549577 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rmvqf"] Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.908019 4937 generic.go:334] "Generic (PLEG): container finished" podID="79c0d99d-3837-4413-b054-42d1d9d7f244" containerID="22d4f304108b3d63be423c69fd246154d6ddb9fe20949c5b6d3074d41faaca08" exitCode=0 Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.908146 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmvqf" event={"ID":"79c0d99d-3837-4413-b054-42d1d9d7f244","Type":"ContainerDied","Data":"22d4f304108b3d63be423c69fd246154d6ddb9fe20949c5b6d3074d41faaca08"} Feb 25 16:04:13 crc kubenswrapper[4937]: I0225 16:04:13.908616 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmvqf" event={"ID":"79c0d99d-3837-4413-b054-42d1d9d7f244","Type":"ContainerStarted","Data":"a4c6d3c823a808c360f2308e76778ebacc6ae1064961cf64a175e89d772d56fd"} Feb 25 16:04:15 crc kubenswrapper[4937]: I0225 16:04:15.925939 4937 generic.go:334] "Generic (PLEG): container finished" podID="79c0d99d-3837-4413-b054-42d1d9d7f244" containerID="561052612f369aa6d01aa40081c116f124afc619e0fef1c39d15417f6abcc4f5" exitCode=0 Feb 25 16:04:15 crc kubenswrapper[4937]: I0225 16:04:15.926060 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmvqf" event={"ID":"79c0d99d-3837-4413-b054-42d1d9d7f244","Type":"ContainerDied","Data":"561052612f369aa6d01aa40081c116f124afc619e0fef1c39d15417f6abcc4f5"} Feb 25 16:04:16 crc kubenswrapper[4937]: I0225 16:04:16.937388 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmvqf" event={"ID":"79c0d99d-3837-4413-b054-42d1d9d7f244","Type":"ContainerStarted","Data":"25af5656086dddcc6c1ad457024e80844470145d5667661ae95c0e4e53f24a1d"} Feb 25 16:04:16 crc kubenswrapper[4937]: I0225 16:04:16.958471 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rmvqf" podStartSLOduration=2.367349604 podStartE2EDuration="4.95845375s" podCreationTimestamp="2026-02-25 16:04:12 +0000 UTC" firstStartedPulling="2026-02-25 16:04:13.909690043 +0000 UTC m=+1104.923081953" lastFinishedPulling="2026-02-25 16:04:16.500794209 +0000 UTC m=+1107.514186099" observedRunningTime="2026-02-25 16:04:16.954145282 +0000 UTC m=+1107.967537172" watchObservedRunningTime="2026-02-25 16:04:16.95845375 +0000 UTC m=+1107.971845640" Feb 25 16:04:21 crc kubenswrapper[4937]: I0225 16:04:21.827714 4937 scope.go:117] "RemoveContainer" containerID="d49c93d817419b11494637c1e12156aa071d1ed713591de7ead4976e7ad12c93" Feb 25 16:04:23 crc kubenswrapper[4937]: I0225 16:04:23.269431 4937 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:23 crc kubenswrapper[4937]: I0225 16:04:23.269780 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:23 crc kubenswrapper[4937]: I0225 16:04:23.331814 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:24 crc kubenswrapper[4937]: I0225 16:04:24.036358 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:24 crc kubenswrapper[4937]: I0225 16:04:24.085538 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rmvqf"] Feb 25 16:04:27 crc kubenswrapper[4937]: I0225 16:04:26.006702 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rmvqf" podUID="79c0d99d-3837-4413-b054-42d1d9d7f244" containerName="registry-server" containerID="cri-o://25af5656086dddcc6c1ad457024e80844470145d5667661ae95c0e4e53f24a1d" gracePeriod=2 Feb 25 16:04:27 crc kubenswrapper[4937]: I0225 16:04:27.837687 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.017617 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c0d99d-3837-4413-b054-42d1d9d7f244-catalog-content\") pod \"79c0d99d-3837-4413-b054-42d1d9d7f244\" (UID: \"79c0d99d-3837-4413-b054-42d1d9d7f244\") " Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.017712 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhr2s\" (UniqueName: \"kubernetes.io/projected/79c0d99d-3837-4413-b054-42d1d9d7f244-kube-api-access-xhr2s\") pod \"79c0d99d-3837-4413-b054-42d1d9d7f244\" (UID: \"79c0d99d-3837-4413-b054-42d1d9d7f244\") " Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.017757 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c0d99d-3837-4413-b054-42d1d9d7f244-utilities\") pod \"79c0d99d-3837-4413-b054-42d1d9d7f244\" (UID: \"79c0d99d-3837-4413-b054-42d1d9d7f244\") " Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.018752 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79c0d99d-3837-4413-b054-42d1d9d7f244-utilities" (OuterVolumeSpecName: "utilities") pod "79c0d99d-3837-4413-b054-42d1d9d7f244" (UID: "79c0d99d-3837-4413-b054-42d1d9d7f244"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.020347 4937 generic.go:334] "Generic (PLEG): container finished" podID="79c0d99d-3837-4413-b054-42d1d9d7f244" containerID="25af5656086dddcc6c1ad457024e80844470145d5667661ae95c0e4e53f24a1d" exitCode=0 Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.020411 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rmvqf" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.020403 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmvqf" event={"ID":"79c0d99d-3837-4413-b054-42d1d9d7f244","Type":"ContainerDied","Data":"25af5656086dddcc6c1ad457024e80844470145d5667661ae95c0e4e53f24a1d"} Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.020474 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmvqf" event={"ID":"79c0d99d-3837-4413-b054-42d1d9d7f244","Type":"ContainerDied","Data":"a4c6d3c823a808c360f2308e76778ebacc6ae1064961cf64a175e89d772d56fd"} Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.020521 4937 scope.go:117] "RemoveContainer" containerID="25af5656086dddcc6c1ad457024e80844470145d5667661ae95c0e4e53f24a1d" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.032744 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c0d99d-3837-4413-b054-42d1d9d7f244-kube-api-access-xhr2s" (OuterVolumeSpecName: "kube-api-access-xhr2s") pod "79c0d99d-3837-4413-b054-42d1d9d7f244" (UID: "79c0d99d-3837-4413-b054-42d1d9d7f244"). InnerVolumeSpecName "kube-api-access-xhr2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.058371 4937 scope.go:117] "RemoveContainer" containerID="561052612f369aa6d01aa40081c116f124afc619e0fef1c39d15417f6abcc4f5" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.077646 4937 scope.go:117] "RemoveContainer" containerID="22d4f304108b3d63be423c69fd246154d6ddb9fe20949c5b6d3074d41faaca08" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.094028 4937 scope.go:117] "RemoveContainer" containerID="25af5656086dddcc6c1ad457024e80844470145d5667661ae95c0e4e53f24a1d" Feb 25 16:04:28 crc kubenswrapper[4937]: E0225 16:04:28.094704 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25af5656086dddcc6c1ad457024e80844470145d5667661ae95c0e4e53f24a1d\": container with ID starting with 25af5656086dddcc6c1ad457024e80844470145d5667661ae95c0e4e53f24a1d not found: ID does not exist" containerID="25af5656086dddcc6c1ad457024e80844470145d5667661ae95c0e4e53f24a1d" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.094821 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25af5656086dddcc6c1ad457024e80844470145d5667661ae95c0e4e53f24a1d"} err="failed to get container status \"25af5656086dddcc6c1ad457024e80844470145d5667661ae95c0e4e53f24a1d\": rpc error: code = NotFound desc = could not find container \"25af5656086dddcc6c1ad457024e80844470145d5667661ae95c0e4e53f24a1d\": container with ID starting with 25af5656086dddcc6c1ad457024e80844470145d5667661ae95c0e4e53f24a1d not found: ID does not exist" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.094929 4937 scope.go:117] "RemoveContainer" containerID="561052612f369aa6d01aa40081c116f124afc619e0fef1c39d15417f6abcc4f5" Feb 25 16:04:28 crc kubenswrapper[4937]: E0225 16:04:28.095312 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"561052612f369aa6d01aa40081c116f124afc619e0fef1c39d15417f6abcc4f5\": container with ID starting with 561052612f369aa6d01aa40081c116f124afc619e0fef1c39d15417f6abcc4f5 not found: ID does not exist" 
containerID="561052612f369aa6d01aa40081c116f124afc619e0fef1c39d15417f6abcc4f5" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.095436 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561052612f369aa6d01aa40081c116f124afc619e0fef1c39d15417f6abcc4f5"} err="failed to get container status \"561052612f369aa6d01aa40081c116f124afc619e0fef1c39d15417f6abcc4f5\": rpc error: code = NotFound desc = could not find container \"561052612f369aa6d01aa40081c116f124afc619e0fef1c39d15417f6abcc4f5\": container with ID starting with 561052612f369aa6d01aa40081c116f124afc619e0fef1c39d15417f6abcc4f5 not found: ID does not exist" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.095567 4937 scope.go:117] "RemoveContainer" containerID="22d4f304108b3d63be423c69fd246154d6ddb9fe20949c5b6d3074d41faaca08" Feb 25 16:04:28 crc kubenswrapper[4937]: E0225 16:04:28.096031 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22d4f304108b3d63be423c69fd246154d6ddb9fe20949c5b6d3074d41faaca08\": container with ID starting with 22d4f304108b3d63be423c69fd246154d6ddb9fe20949c5b6d3074d41faaca08 not found: ID does not exist" containerID="22d4f304108b3d63be423c69fd246154d6ddb9fe20949c5b6d3074d41faaca08" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.096134 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22d4f304108b3d63be423c69fd246154d6ddb9fe20949c5b6d3074d41faaca08"} err="failed to get container status \"22d4f304108b3d63be423c69fd246154d6ddb9fe20949c5b6d3074d41faaca08\": rpc error: code = NotFound desc = could not find container \"22d4f304108b3d63be423c69fd246154d6ddb9fe20949c5b6d3074d41faaca08\": container with ID starting with 22d4f304108b3d63be423c69fd246154d6ddb9fe20949c5b6d3074d41faaca08 not found: ID does not exist" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.119598 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhr2s\" (UniqueName: \"kubernetes.io/projected/79c0d99d-3837-4413-b054-42d1d9d7f244-kube-api-access-xhr2s\") on node \"crc\" DevicePath \"\"" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.119877 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79c0d99d-3837-4413-b054-42d1d9d7f244-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.604456 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79c0d99d-3837-4413-b054-42d1d9d7f244-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79c0d99d-3837-4413-b054-42d1d9d7f244" (UID: "79c0d99d-3837-4413-b054-42d1d9d7f244"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.626211 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79c0d99d-3837-4413-b054-42d1d9d7f244-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.652209 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rmvqf"] Feb 25 16:04:28 crc kubenswrapper[4937]: I0225 16:04:28.661554 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rmvqf"] Feb 25 16:04:29 crc kubenswrapper[4937]: I0225 16:04:29.374381 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c0d99d-3837-4413-b054-42d1d9d7f244" path="/var/lib/kubelet/pods/79c0d99d-3837-4413-b054-42d1d9d7f244/volumes" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.260264 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nnzrv"] Feb 25 16:04:35 crc kubenswrapper[4937]: E0225 16:04:35.261089 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c0d99d-3837-4413-b054-42d1d9d7f244" containerName="extract-content" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.261108 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c0d99d-3837-4413-b054-42d1d9d7f244" containerName="extract-content" Feb 25 16:04:35 crc kubenswrapper[4937]: E0225 16:04:35.261131 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c0d99d-3837-4413-b054-42d1d9d7f244" containerName="registry-server" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.261143 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c0d99d-3837-4413-b054-42d1d9d7f244" containerName="registry-server" Feb 25 16:04:35 crc kubenswrapper[4937]: E0225 16:04:35.261165 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c0d99d-3837-4413-b054-42d1d9d7f244" containerName="extract-utilities" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.261179 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c0d99d-3837-4413-b054-42d1d9d7f244" containerName="extract-utilities" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.261365 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c0d99d-3837-4413-b054-42d1d9d7f244" containerName="registry-server" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.262812 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.280272 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nnzrv"] Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.410826 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11258a71-1d31-4829-8f66-afb999a9611d-catalog-content\") pod \"certified-operators-nnzrv\" (UID: \"11258a71-1d31-4829-8f66-afb999a9611d\") " pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.410885 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fdpq\" (UniqueName: \"kubernetes.io/projected/11258a71-1d31-4829-8f66-afb999a9611d-kube-api-access-8fdpq\") pod \"certified-operators-nnzrv\" (UID: \"11258a71-1d31-4829-8f66-afb999a9611d\") " pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.410987 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11258a71-1d31-4829-8f66-afb999a9611d-utilities\") pod \"certified-operators-nnzrv\" (UID: \"11258a71-1d31-4829-8f66-afb999a9611d\") " pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.511919 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11258a71-1d31-4829-8f66-afb999a9611d-utilities\") pod \"certified-operators-nnzrv\" (UID: \"11258a71-1d31-4829-8f66-afb999a9611d\") " pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.512040 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11258a71-1d31-4829-8f66-afb999a9611d-catalog-content\") pod \"certified-operators-nnzrv\" (UID: \"11258a71-1d31-4829-8f66-afb999a9611d\") " pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.512068 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fdpq\" (UniqueName: \"kubernetes.io/projected/11258a71-1d31-4829-8f66-afb999a9611d-kube-api-access-8fdpq\") pod \"certified-operators-nnzrv\" (UID: \"11258a71-1d31-4829-8f66-afb999a9611d\") " pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.512526 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11258a71-1d31-4829-8f66-afb999a9611d-utilities\") pod \"certified-operators-nnzrv\" (UID: \"11258a71-1d31-4829-8f66-afb999a9611d\") " pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.512770 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11258a71-1d31-4829-8f66-afb999a9611d-catalog-content\") pod \"certified-operators-nnzrv\" (UID: \"11258a71-1d31-4829-8f66-afb999a9611d\") " pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.535926 4937 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8fdpq\" (UniqueName: \"kubernetes.io/projected/11258a71-1d31-4829-8f66-afb999a9611d-kube-api-access-8fdpq\") pod \"certified-operators-nnzrv\" (UID: \"11258a71-1d31-4829-8f66-afb999a9611d\") " pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:35 crc kubenswrapper[4937]: I0225 16:04:35.584897 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:36 crc kubenswrapper[4937]: I0225 16:04:36.085869 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nnzrv"] Feb 25 16:04:37 crc kubenswrapper[4937]: I0225 16:04:37.084260 4937 generic.go:334] "Generic (PLEG): container finished" podID="11258a71-1d31-4829-8f66-afb999a9611d" containerID="6504cd85af459f3157ef1f4dcc5ea86d8907fc40a3de7ef5dfe0ea388b0078ad" exitCode=0 Feb 25 16:04:37 crc kubenswrapper[4937]: I0225 16:04:37.084419 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnzrv" event={"ID":"11258a71-1d31-4829-8f66-afb999a9611d","Type":"ContainerDied","Data":"6504cd85af459f3157ef1f4dcc5ea86d8907fc40a3de7ef5dfe0ea388b0078ad"} Feb 25 16:04:37 crc kubenswrapper[4937]: I0225 16:04:37.084626 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnzrv" event={"ID":"11258a71-1d31-4829-8f66-afb999a9611d","Type":"ContainerStarted","Data":"e895e9a47b1603c51593d80801ee9a191a1cb4e8389a126827561261d518c55c"} Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.074857 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k"] Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.076563 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.079059 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.086895 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k"] Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.146824 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42fd7b47-664a-4b65-8804-417a7fdd9b2f-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k\" (UID: \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.146904 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx8dw\" (UniqueName: \"kubernetes.io/projected/42fd7b47-664a-4b65-8804-417a7fdd9b2f-kube-api-access-kx8dw\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k\" (UID: \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.146930 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42fd7b47-664a-4b65-8804-417a7fdd9b2f-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k\" (UID: \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.247698 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42fd7b47-664a-4b65-8804-417a7fdd9b2f-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k\" (UID: \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.247783 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx8dw\" (UniqueName: \"kubernetes.io/projected/42fd7b47-664a-4b65-8804-417a7fdd9b2f-kube-api-access-kx8dw\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k\" (UID: \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.247807 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42fd7b47-664a-4b65-8804-417a7fdd9b2f-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k\" (UID: \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.248284 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/42fd7b47-664a-4b65-8804-417a7fdd9b2f-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k\" (UID: \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.248335 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42fd7b47-664a-4b65-8804-417a7fdd9b2f-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k\" (UID: \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.276727 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx8dw\" (UniqueName: \"kubernetes.io/projected/42fd7b47-664a-4b65-8804-417a7fdd9b2f-kube-api-access-kx8dw\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k\" (UID: \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.403841 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" Feb 25 16:04:38 crc kubenswrapper[4937]: I0225 16:04:38.639638 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k"] Feb 25 16:04:39 crc kubenswrapper[4937]: I0225 16:04:39.101753 4937 generic.go:334] "Generic (PLEG): container finished" podID="11258a71-1d31-4829-8f66-afb999a9611d" containerID="6a531df34100d25b73192e5f8600ff54920e3ab87c4bab7f0810e56a41e5bded" exitCode=0 Feb 25 16:04:39 crc kubenswrapper[4937]: I0225 16:04:39.101887 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnzrv" event={"ID":"11258a71-1d31-4829-8f66-afb999a9611d","Type":"ContainerDied","Data":"6a531df34100d25b73192e5f8600ff54920e3ab87c4bab7f0810e56a41e5bded"} Feb 25 16:04:39 crc kubenswrapper[4937]: I0225 16:04:39.117731 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" event={"ID":"42fd7b47-664a-4b65-8804-417a7fdd9b2f","Type":"ContainerStarted","Data":"9ceebf9103b474099e7dfc41402b3cdd59b7fcd092fe7baf66a75f219a18506d"} Feb 25 16:04:39 crc kubenswrapper[4937]: I0225 16:04:39.117809 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" event={"ID":"42fd7b47-664a-4b65-8804-417a7fdd9b2f","Type":"ContainerStarted","Data":"6061507c6159dec068a83a45777916d0b1f419e84121a441c396a1068da7abbe"} Feb 25 16:04:40 crc kubenswrapper[4937]: I0225 16:04:40.126846 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnzrv" event={"ID":"11258a71-1d31-4829-8f66-afb999a9611d","Type":"ContainerStarted","Data":"ebd1cf1e87984bc6f9ca2e3042b5fde6ef5b523ce88b3d5ff734e3c731947d5a"} Feb 25 16:04:40 crc kubenswrapper[4937]: I0225 16:04:40.128983 4937 generic.go:334] "Generic (PLEG): container finished" podID="42fd7b47-664a-4b65-8804-417a7fdd9b2f" containerID="9ceebf9103b474099e7dfc41402b3cdd59b7fcd092fe7baf66a75f219a18506d" exitCode=0 Feb 25 16:04:40 crc 
kubenswrapper[4937]: I0225 16:04:40.129032 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" event={"ID":"42fd7b47-664a-4b65-8804-417a7fdd9b2f","Type":"ContainerDied","Data":"9ceebf9103b474099e7dfc41402b3cdd59b7fcd092fe7baf66a75f219a18506d"} Feb 25 16:04:40 crc kubenswrapper[4937]: I0225 16:04:40.155945 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nnzrv" podStartSLOduration=2.64540012 podStartE2EDuration="5.155924594s" podCreationTimestamp="2026-02-25 16:04:35 +0000 UTC" firstStartedPulling="2026-02-25 16:04:37.08654479 +0000 UTC m=+1128.099936680" lastFinishedPulling="2026-02-25 16:04:39.597069244 +0000 UTC m=+1130.610461154" observedRunningTime="2026-02-25 16:04:40.15496008 +0000 UTC m=+1131.168351990" watchObservedRunningTime="2026-02-25 16:04:40.155924594 +0000 UTC m=+1131.169316504" Feb 25 16:04:41 crc kubenswrapper[4937]: I0225 16:04:41.494554 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:04:41 crc kubenswrapper[4937]: I0225 16:04:41.494823 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:04:41 crc kubenswrapper[4937]: I0225 16:04:41.494860 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 16:04:41 crc kubenswrapper[4937]: I0225 16:04:41.495245 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3de247f04ff3abf939866313cfef1da7c2e6ae7d14d3da3ecda7ba81bfc35f7"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 16:04:41 crc kubenswrapper[4937]: I0225 16:04:41.495295 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://a3de247f04ff3abf939866313cfef1da7c2e6ae7d14d3da3ecda7ba81bfc35f7" gracePeriod=600 Feb 25 16:04:42 crc kubenswrapper[4937]: I0225 16:04:42.143357 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="a3de247f04ff3abf939866313cfef1da7c2e6ae7d14d3da3ecda7ba81bfc35f7" exitCode=0 Feb 25 16:04:42 crc kubenswrapper[4937]: I0225 16:04:42.143429 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"a3de247f04ff3abf939866313cfef1da7c2e6ae7d14d3da3ecda7ba81bfc35f7"} Feb 25 16:04:42 crc kubenswrapper[4937]: I0225 16:04:42.143857 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" 
event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"82d7f39c6bdd0c324e2d3b37551824fca9f991542926e3b5f5cc2a5a3ef74dfb"} Feb 25 16:04:42 crc kubenswrapper[4937]: I0225 16:04:42.143883 4937 scope.go:117] "RemoveContainer" containerID="95cb15cff9f98a839dbce0c2049d37638146e1c360cc03bdc2f2c9958a469258" Feb 25 16:04:43 crc kubenswrapper[4937]: I0225 16:04:43.150975 4937 generic.go:334] "Generic (PLEG): container finished" podID="42fd7b47-664a-4b65-8804-417a7fdd9b2f" containerID="1552a35c0113e5c95c14e481d6a2baf3571a937b8bb9aff40c48a6ea8c9cd545" exitCode=0 Feb 25 16:04:43 crc kubenswrapper[4937]: I0225 16:04:43.151092 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" event={"ID":"42fd7b47-664a-4b65-8804-417a7fdd9b2f","Type":"ContainerDied","Data":"1552a35c0113e5c95c14e481d6a2baf3571a937b8bb9aff40c48a6ea8c9cd545"} Feb 25 16:04:44 crc kubenswrapper[4937]: I0225 16:04:44.165932 4937 generic.go:334] "Generic (PLEG): container finished" podID="42fd7b47-664a-4b65-8804-417a7fdd9b2f" containerID="162e7b1e2fd1d8f4f4fd4fba1efffd93ecfa155aa569cd345d6135b5d8695e38" exitCode=0 Feb 25 16:04:44 crc kubenswrapper[4937]: I0225 16:04:44.166020 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" event={"ID":"42fd7b47-664a-4b65-8804-417a7fdd9b2f","Type":"ContainerDied","Data":"162e7b1e2fd1d8f4f4fd4fba1efffd93ecfa155aa569cd345d6135b5d8695e38"} Feb 25 16:04:45 crc kubenswrapper[4937]: I0225 16:04:45.470401 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" Feb 25 16:04:45 crc kubenswrapper[4937]: I0225 16:04:45.570133 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42fd7b47-664a-4b65-8804-417a7fdd9b2f-util\") pod \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\" (UID: \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\") " Feb 25 16:04:45 crc kubenswrapper[4937]: I0225 16:04:45.570233 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42fd7b47-664a-4b65-8804-417a7fdd9b2f-bundle\") pod \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\" (UID: \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\") " Feb 25 16:04:45 crc kubenswrapper[4937]: I0225 16:04:45.570331 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx8dw\" (UniqueName: \"kubernetes.io/projected/42fd7b47-664a-4b65-8804-417a7fdd9b2f-kube-api-access-kx8dw\") pod \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\" (UID: \"42fd7b47-664a-4b65-8804-417a7fdd9b2f\") " Feb 25 16:04:45 crc kubenswrapper[4937]: I0225 16:04:45.570840 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42fd7b47-664a-4b65-8804-417a7fdd9b2f-bundle" (OuterVolumeSpecName: "bundle") pod "42fd7b47-664a-4b65-8804-417a7fdd9b2f" (UID: "42fd7b47-664a-4b65-8804-417a7fdd9b2f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:04:45 crc kubenswrapper[4937]: I0225 16:04:45.577657 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42fd7b47-664a-4b65-8804-417a7fdd9b2f-kube-api-access-kx8dw" (OuterVolumeSpecName: "kube-api-access-kx8dw") pod "42fd7b47-664a-4b65-8804-417a7fdd9b2f" (UID: "42fd7b47-664a-4b65-8804-417a7fdd9b2f"). InnerVolumeSpecName "kube-api-access-kx8dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:04:45 crc kubenswrapper[4937]: I0225 16:04:45.582057 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42fd7b47-664a-4b65-8804-417a7fdd9b2f-util" (OuterVolumeSpecName: "util") pod "42fd7b47-664a-4b65-8804-417a7fdd9b2f" (UID: "42fd7b47-664a-4b65-8804-417a7fdd9b2f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:04:45 crc kubenswrapper[4937]: I0225 16:04:45.585790 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:45 crc kubenswrapper[4937]: I0225 16:04:45.586147 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:45 crc kubenswrapper[4937]: I0225 16:04:45.634228 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:45 crc kubenswrapper[4937]: I0225 16:04:45.672057 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx8dw\" (UniqueName: \"kubernetes.io/projected/42fd7b47-664a-4b65-8804-417a7fdd9b2f-kube-api-access-kx8dw\") on node \"crc\" DevicePath \"\"" Feb 25 16:04:45 crc kubenswrapper[4937]: I0225 16:04:45.672360 4937 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42fd7b47-664a-4b65-8804-417a7fdd9b2f-util\") on node \"crc\" DevicePath \"\"" Feb 25 16:04:45 crc kubenswrapper[4937]: I0225 16:04:45.672369 4937 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42fd7b47-664a-4b65-8804-417a7fdd9b2f-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:04:46 crc kubenswrapper[4937]: I0225 16:04:46.180521 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" event={"ID":"42fd7b47-664a-4b65-8804-417a7fdd9b2f","Type":"ContainerDied","Data":"6061507c6159dec068a83a45777916d0b1f419e84121a441c396a1068da7abbe"} Feb 25 16:04:46 crc kubenswrapper[4937]: I0225 16:04:46.180559 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6061507c6159dec068a83a45777916d0b1f419e84121a441c396a1068da7abbe" Feb 25 16:04:46 crc kubenswrapper[4937]: I0225 16:04:46.180615 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k" Feb 25 16:04:46 crc kubenswrapper[4937]: I0225 16:04:46.219369 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.033972 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nnzrv"] Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.203538 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nnzrv" podUID="11258a71-1d31-4829-8f66-afb999a9611d" containerName="registry-server" containerID="cri-o://ebd1cf1e87984bc6f9ca2e3042b5fde6ef5b523ce88b3d5ff734e3c731947d5a" gracePeriod=2 Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.472560 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-c4wvk"] Feb 25 16:04:49 crc kubenswrapper[4937]: E0225 16:04:49.472879 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42fd7b47-664a-4b65-8804-417a7fdd9b2f" containerName="pull" Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.472906 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fd7b47-664a-4b65-8804-417a7fdd9b2f" containerName="pull" Feb 25 16:04:49 crc kubenswrapper[4937]: E0225 16:04:49.472927 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42fd7b47-664a-4b65-8804-417a7fdd9b2f" containerName="extract" Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.472939 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fd7b47-664a-4b65-8804-417a7fdd9b2f" containerName="extract" Feb 25 16:04:49 crc kubenswrapper[4937]: E0225 16:04:49.472955 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42fd7b47-664a-4b65-8804-417a7fdd9b2f" containerName="util" Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.472966 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fd7b47-664a-4b65-8804-417a7fdd9b2f" containerName="util" Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.473130 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="42fd7b47-664a-4b65-8804-417a7fdd9b2f" containerName="extract" Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.473719 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-c4wvk" Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.475878 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-pqxs5" Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.476171 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.476562 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.487217 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-c4wvk"] Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.532921 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7prx7\" (UniqueName: \"kubernetes.io/projected/086feb17-6360-4d8f-a766-78607300c491-kube-api-access-7prx7\") pod \"nmstate-operator-75c5dccd6c-c4wvk\" (UID: \"086feb17-6360-4d8f-a766-78607300c491\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-c4wvk" Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.633943 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7prx7\" (UniqueName: \"kubernetes.io/projected/086feb17-6360-4d8f-a766-78607300c491-kube-api-access-7prx7\") pod \"nmstate-operator-75c5dccd6c-c4wvk\" (UID: \"086feb17-6360-4d8f-a766-78607300c491\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-c4wvk" Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.652684 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7prx7\" (UniqueName: \"kubernetes.io/projected/086feb17-6360-4d8f-a766-78607300c491-kube-api-access-7prx7\") pod \"nmstate-operator-75c5dccd6c-c4wvk\" (UID: \"086feb17-6360-4d8f-a766-78607300c491\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-c4wvk" Feb 25 16:04:49 crc kubenswrapper[4937]: I0225 16:04:49.789300 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-c4wvk" Feb 25 16:04:50 crc kubenswrapper[4937]: I0225 16:04:50.081386 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-c4wvk"] Feb 25 16:04:50 crc kubenswrapper[4937]: I0225 16:04:50.208990 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-c4wvk" event={"ID":"086feb17-6360-4d8f-a766-78607300c491","Type":"ContainerStarted","Data":"7c325cd6c4143f24377d6fe38fba394c6a8ba546b64e31027f3d54cf5720b419"} Feb 25 16:04:50 crc kubenswrapper[4937]: I0225 16:04:50.792551 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:50 crc kubenswrapper[4937]: I0225 16:04:50.948623 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11258a71-1d31-4829-8f66-afb999a9611d-catalog-content\") pod \"11258a71-1d31-4829-8f66-afb999a9611d\" (UID: \"11258a71-1d31-4829-8f66-afb999a9611d\") " Feb 25 16:04:50 crc kubenswrapper[4937]: I0225 16:04:50.948757 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fdpq\" (UniqueName: \"kubernetes.io/projected/11258a71-1d31-4829-8f66-afb999a9611d-kube-api-access-8fdpq\") pod \"11258a71-1d31-4829-8f66-afb999a9611d\" (UID: \"11258a71-1d31-4829-8f66-afb999a9611d\") " Feb 25 16:04:50 crc kubenswrapper[4937]: I0225 16:04:50.950017 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11258a71-1d31-4829-8f66-afb999a9611d-utilities\") pod \"11258a71-1d31-4829-8f66-afb999a9611d\" (UID: \"11258a71-1d31-4829-8f66-afb999a9611d\") " Feb 25 16:04:50 crc kubenswrapper[4937]: I0225 16:04:50.950771 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11258a71-1d31-4829-8f66-afb999a9611d-utilities" (OuterVolumeSpecName: "utilities") pod "11258a71-1d31-4829-8f66-afb999a9611d" (UID: "11258a71-1d31-4829-8f66-afb999a9611d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:04:50 crc kubenswrapper[4937]: I0225 16:04:50.951048 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11258a71-1d31-4829-8f66-afb999a9611d-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:04:50 crc kubenswrapper[4937]: I0225 16:04:50.968686 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11258a71-1d31-4829-8f66-afb999a9611d-kube-api-access-8fdpq" (OuterVolumeSpecName: "kube-api-access-8fdpq") pod "11258a71-1d31-4829-8f66-afb999a9611d" (UID: "11258a71-1d31-4829-8f66-afb999a9611d"). InnerVolumeSpecName "kube-api-access-8fdpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.010306 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11258a71-1d31-4829-8f66-afb999a9611d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11258a71-1d31-4829-8f66-afb999a9611d" (UID: "11258a71-1d31-4829-8f66-afb999a9611d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.052341 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11258a71-1d31-4829-8f66-afb999a9611d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.052381 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fdpq\" (UniqueName: \"kubernetes.io/projected/11258a71-1d31-4829-8f66-afb999a9611d-kube-api-access-8fdpq\") on node \"crc\" DevicePath \"\"" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.217128 4937 generic.go:334] "Generic (PLEG): container finished" podID="11258a71-1d31-4829-8f66-afb999a9611d" containerID="ebd1cf1e87984bc6f9ca2e3042b5fde6ef5b523ce88b3d5ff734e3c731947d5a" exitCode=0 Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.217172 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnzrv" event={"ID":"11258a71-1d31-4829-8f66-afb999a9611d","Type":"ContainerDied","Data":"ebd1cf1e87984bc6f9ca2e3042b5fde6ef5b523ce88b3d5ff734e3c731947d5a"} Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.217204 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nnzrv" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.217215 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnzrv" event={"ID":"11258a71-1d31-4829-8f66-afb999a9611d","Type":"ContainerDied","Data":"e895e9a47b1603c51593d80801ee9a191a1cb4e8389a126827561261d518c55c"} Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.217238 4937 scope.go:117] "RemoveContainer" containerID="ebd1cf1e87984bc6f9ca2e3042b5fde6ef5b523ce88b3d5ff734e3c731947d5a" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.232748 4937 scope.go:117] "RemoveContainer" containerID="6a531df34100d25b73192e5f8600ff54920e3ab87c4bab7f0810e56a41e5bded" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.249221 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nnzrv"] Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.249280 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nnzrv"] Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.268308 4937 scope.go:117] "RemoveContainer" containerID="6504cd85af459f3157ef1f4dcc5ea86d8907fc40a3de7ef5dfe0ea388b0078ad" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.286739 4937 scope.go:117] "RemoveContainer" containerID="ebd1cf1e87984bc6f9ca2e3042b5fde6ef5b523ce88b3d5ff734e3c731947d5a" Feb 25 16:04:51 crc kubenswrapper[4937]: E0225 16:04:51.287155 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd1cf1e87984bc6f9ca2e3042b5fde6ef5b523ce88b3d5ff734e3c731947d5a\": container with ID starting with ebd1cf1e87984bc6f9ca2e3042b5fde6ef5b523ce88b3d5ff734e3c731947d5a not found: ID does not exist" containerID="ebd1cf1e87984bc6f9ca2e3042b5fde6ef5b523ce88b3d5ff734e3c731947d5a" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.287188 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd1cf1e87984bc6f9ca2e3042b5fde6ef5b523ce88b3d5ff734e3c731947d5a"} err="failed to get container status 
\"ebd1cf1e87984bc6f9ca2e3042b5fde6ef5b523ce88b3d5ff734e3c731947d5a\": rpc error: code = NotFound desc = could not find container \"ebd1cf1e87984bc6f9ca2e3042b5fde6ef5b523ce88b3d5ff734e3c731947d5a\": container with ID starting with ebd1cf1e87984bc6f9ca2e3042b5fde6ef5b523ce88b3d5ff734e3c731947d5a not found: ID does not exist" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.287213 4937 scope.go:117] "RemoveContainer" containerID="6a531df34100d25b73192e5f8600ff54920e3ab87c4bab7f0810e56a41e5bded" Feb 25 16:04:51 crc kubenswrapper[4937]: E0225 16:04:51.287410 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a531df34100d25b73192e5f8600ff54920e3ab87c4bab7f0810e56a41e5bded\": container with ID starting with 6a531df34100d25b73192e5f8600ff54920e3ab87c4bab7f0810e56a41e5bded not found: ID does not exist" containerID="6a531df34100d25b73192e5f8600ff54920e3ab87c4bab7f0810e56a41e5bded" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.287435 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a531df34100d25b73192e5f8600ff54920e3ab87c4bab7f0810e56a41e5bded"} err="failed to get container status \"6a531df34100d25b73192e5f8600ff54920e3ab87c4bab7f0810e56a41e5bded\": rpc error: code = NotFound desc = could not find container \"6a531df34100d25b73192e5f8600ff54920e3ab87c4bab7f0810e56a41e5bded\": container with ID starting with 6a531df34100d25b73192e5f8600ff54920e3ab87c4bab7f0810e56a41e5bded not found: ID does not exist" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.287454 4937 scope.go:117] "RemoveContainer" containerID="6504cd85af459f3157ef1f4dcc5ea86d8907fc40a3de7ef5dfe0ea388b0078ad" Feb 25 16:04:51 crc kubenswrapper[4937]: E0225 16:04:51.287663 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6504cd85af459f3157ef1f4dcc5ea86d8907fc40a3de7ef5dfe0ea388b0078ad\": container with ID starting with 6504cd85af459f3157ef1f4dcc5ea86d8907fc40a3de7ef5dfe0ea388b0078ad not found: ID does not exist" containerID="6504cd85af459f3157ef1f4dcc5ea86d8907fc40a3de7ef5dfe0ea388b0078ad" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.287687 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6504cd85af459f3157ef1f4dcc5ea86d8907fc40a3de7ef5dfe0ea388b0078ad"} err="failed to get container status \"6504cd85af459f3157ef1f4dcc5ea86d8907fc40a3de7ef5dfe0ea388b0078ad\": rpc error: code = NotFound desc = could not find container \"6504cd85af459f3157ef1f4dcc5ea86d8907fc40a3de7ef5dfe0ea388b0078ad\": container with ID starting with 6504cd85af459f3157ef1f4dcc5ea86d8907fc40a3de7ef5dfe0ea388b0078ad not found: ID does not exist" Feb 25 16:04:51 crc kubenswrapper[4937]: I0225 16:04:51.378747 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11258a71-1d31-4829-8f66-afb999a9611d" path="/var/lib/kubelet/pods/11258a71-1d31-4829-8f66-afb999a9611d/volumes" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.047992 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5z29n"] Feb 25 16:04:53 crc kubenswrapper[4937]: E0225 16:04:53.057830 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11258a71-1d31-4829-8f66-afb999a9611d" containerName="extract-content" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.057849 4937 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="11258a71-1d31-4829-8f66-afb999a9611d" containerName="extract-content" Feb 25 16:04:53 crc kubenswrapper[4937]: E0225 16:04:53.057863 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11258a71-1d31-4829-8f66-afb999a9611d" containerName="registry-server" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.057869 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="11258a71-1d31-4829-8f66-afb999a9611d" containerName="registry-server" Feb 25 16:04:53 crc kubenswrapper[4937]: E0225 16:04:53.057883 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11258a71-1d31-4829-8f66-afb999a9611d" containerName="extract-utilities" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.057889 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="11258a71-1d31-4829-8f66-afb999a9611d" containerName="extract-utilities" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.057993 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="11258a71-1d31-4829-8f66-afb999a9611d" containerName="registry-server" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.058869 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.060347 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z29n"] Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.177547 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-catalog-content\") pod \"redhat-marketplace-5z29n\" (UID: \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\") " pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.177592 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dffxm\" (UniqueName: \"kubernetes.io/projected/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-kube-api-access-dffxm\") pod \"redhat-marketplace-5z29n\" (UID: \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\") " pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.177641 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-utilities\") pod \"redhat-marketplace-5z29n\" (UID: \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\") " pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.230603 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-c4wvk" event={"ID":"086feb17-6360-4d8f-a766-78607300c491","Type":"ContainerStarted","Data":"9a15b75207c3b75228bb7a3efe45858c144c3e57e62c4e9b9a3fbcde6cd23803"} Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.261686 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-c4wvk" podStartSLOduration=1.584840432 podStartE2EDuration="4.261668358s" podCreationTimestamp="2026-02-25 16:04:49 +0000 UTC" firstStartedPulling="2026-02-25 16:04:50.092974073 +0000 UTC m=+1141.106365963" lastFinishedPulling="2026-02-25 16:04:52.769801989 +0000 UTC m=+1143.783193889" observedRunningTime="2026-02-25 16:04:53.261575176 +0000 UTC 
m=+1144.274967066" watchObservedRunningTime="2026-02-25 16:04:53.261668358 +0000 UTC m=+1144.275060248" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.279120 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-catalog-content\") pod \"redhat-marketplace-5z29n\" (UID: \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\") " pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.279380 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dffxm\" (UniqueName: \"kubernetes.io/projected/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-kube-api-access-dffxm\") pod \"redhat-marketplace-5z29n\" (UID: \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\") " pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.279512 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-utilities\") pod \"redhat-marketplace-5z29n\" (UID: \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\") " pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.279692 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-catalog-content\") pod \"redhat-marketplace-5z29n\" (UID: \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\") " pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.279976 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-utilities\") pod \"redhat-marketplace-5z29n\" (UID: \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\") " pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.313774 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dffxm\" (UniqueName: \"kubernetes.io/projected/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-kube-api-access-dffxm\") pod \"redhat-marketplace-5z29n\" (UID: \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\") " pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.380808 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:04:53 crc kubenswrapper[4937]: I0225 16:04:53.642706 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z29n"] Feb 25 16:04:53 crc kubenswrapper[4937]: W0225 16:04:53.652805 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc03dc593_5e8a_4b48_bf54_b1ea3db41a34.slice/crio-8a0c8da95957e5bf97963e8c5562a4819fe58c69b070803d18ff1033960d8de6 WatchSource:0}: Error finding container 8a0c8da95957e5bf97963e8c5562a4819fe58c69b070803d18ff1033960d8de6: Status 404 returned error can't find the container with id 8a0c8da95957e5bf97963e8c5562a4819fe58c69b070803d18ff1033960d8de6 Feb 25 16:04:54 crc kubenswrapper[4937]: I0225 16:04:54.238715 4937 generic.go:334] "Generic (PLEG): container finished" podID="c03dc593-5e8a-4b48-bf54-b1ea3db41a34" containerID="9243bd48d6e448a96cb37bac349193735b571f84b42ab617972eb80ad11880b5" exitCode=0 Feb 25 16:04:54 crc kubenswrapper[4937]: I0225 16:04:54.238932 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z29n" event={"ID":"c03dc593-5e8a-4b48-bf54-b1ea3db41a34","Type":"ContainerDied","Data":"9243bd48d6e448a96cb37bac349193735b571f84b42ab617972eb80ad11880b5"} Feb 25 16:04:54 crc kubenswrapper[4937]: I0225 16:04:54.239087 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z29n" event={"ID":"c03dc593-5e8a-4b48-bf54-b1ea3db41a34","Type":"ContainerStarted","Data":"8a0c8da95957e5bf97963e8c5562a4819fe58c69b070803d18ff1033960d8de6"} Feb 25 16:04:56 crc kubenswrapper[4937]: I0225 16:04:56.252815 4937 generic.go:334] "Generic (PLEG): container finished" podID="c03dc593-5e8a-4b48-bf54-b1ea3db41a34" containerID="56152e2c8a43224a54198c2c9df95fec2bfeeb20895b56ac1a3173ea644bcb24" exitCode=0 Feb 25 16:04:56 crc kubenswrapper[4937]: I0225 16:04:56.252876 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z29n" event={"ID":"c03dc593-5e8a-4b48-bf54-b1ea3db41a34","Type":"ContainerDied","Data":"56152e2c8a43224a54198c2c9df95fec2bfeeb20895b56ac1a3173ea644bcb24"} Feb 25 16:04:58 crc kubenswrapper[4937]: I0225 16:04:58.275065 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z29n" event={"ID":"c03dc593-5e8a-4b48-bf54-b1ea3db41a34","Type":"ContainerStarted","Data":"281b236928e7c027ba145a717d130dd8ec23f9b91c002c4c6e837bf2349e8058"} Feb 25 16:04:58 crc kubenswrapper[4937]: I0225 16:04:58.300244 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5z29n" podStartSLOduration=2.197353191 podStartE2EDuration="5.300211315s" podCreationTimestamp="2026-02-25 16:04:53 +0000 UTC" firstStartedPulling="2026-02-25 16:04:54.240531986 +0000 UTC m=+1145.253923876" lastFinishedPulling="2026-02-25 16:04:57.3433901 +0000 UTC m=+1148.356782000" observedRunningTime="2026-02-25 16:04:58.299338203 +0000 UTC m=+1149.312730123" watchObservedRunningTime="2026-02-25 16:04:58.300211315 +0000 UTC m=+1149.313603205" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.153083 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-gs4df"] Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.154213 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-gs4df" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.159804 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xtrcb" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.162377 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj"] Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.163187 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.164341 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5v8v\" (UniqueName: \"kubernetes.io/projected/fee0f5ff-b02d-4a31-921b-e151949932d1-kube-api-access-n5v8v\") pod \"nmstate-webhook-786f45cff4-ljnzj\" (UID: \"fee0f5ff-b02d-4a31-921b-e151949932d1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.164502 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fee0f5ff-b02d-4a31-921b-e151949932d1-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ljnzj\" (UID: \"fee0f5ff-b02d-4a31-921b-e151949932d1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.164739 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmd75\" (UniqueName: \"kubernetes.io/projected/6ce93581-d0da-4acc-978d-4c7b936d736b-kube-api-access-cmd75\") pod \"nmstate-metrics-69594cc75-gs4df\" (UID: \"6ce93581-d0da-4acc-978d-4c7b936d736b\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-gs4df" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.167688 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-gs4df"] Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.174142 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.224120 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj"] Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.229202 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-hb5qm"] Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.230058 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.265512 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3c5b69b1-26a3-4de2-9d56-ffc97c64ddad-dbus-socket\") pod \"nmstate-handler-hb5qm\" (UID: \"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad\") " pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.265785 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmd75\" (UniqueName: \"kubernetes.io/projected/6ce93581-d0da-4acc-978d-4c7b936d736b-kube-api-access-cmd75\") pod \"nmstate-metrics-69594cc75-gs4df\" (UID: \"6ce93581-d0da-4acc-978d-4c7b936d736b\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-gs4df" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.265863 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3c5b69b1-26a3-4de2-9d56-ffc97c64ddad-ovs-socket\") pod \"nmstate-handler-hb5qm\" (UID: \"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad\") " pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.265974 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3c5b69b1-26a3-4de2-9d56-ffc97c64ddad-nmstate-lock\") pod \"nmstate-handler-hb5qm\" (UID: \"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad\") " pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.266092 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5v8v\" (UniqueName: \"kubernetes.io/projected/fee0f5ff-b02d-4a31-921b-e151949932d1-kube-api-access-n5v8v\") pod \"nmstate-webhook-786f45cff4-ljnzj\" (UID: \"fee0f5ff-b02d-4a31-921b-e151949932d1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.266208 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fee0f5ff-b02d-4a31-921b-e151949932d1-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ljnzj\" (UID: \"fee0f5ff-b02d-4a31-921b-e151949932d1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.266383 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfdq2\" (UniqueName: \"kubernetes.io/projected/3c5b69b1-26a3-4de2-9d56-ffc97c64ddad-kube-api-access-vfdq2\") pod \"nmstate-handler-hb5qm\" (UID: \"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad\") " pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: E0225 16:04:59.266337 4937 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 25 16:04:59 crc kubenswrapper[4937]: E0225 16:04:59.266659 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fee0f5ff-b02d-4a31-921b-e151949932d1-tls-key-pair podName:fee0f5ff-b02d-4a31-921b-e151949932d1 nodeName:}" failed. No retries permitted until 2026-02-25 16:04:59.766641871 +0000 UTC m=+1150.780033761 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/fee0f5ff-b02d-4a31-921b-e151949932d1-tls-key-pair") pod "nmstate-webhook-786f45cff4-ljnzj" (UID: "fee0f5ff-b02d-4a31-921b-e151949932d1") : secret "openshift-nmstate-webhook" not found Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.303497 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5v8v\" (UniqueName: \"kubernetes.io/projected/fee0f5ff-b02d-4a31-921b-e151949932d1-kube-api-access-n5v8v\") pod \"nmstate-webhook-786f45cff4-ljnzj\" (UID: \"fee0f5ff-b02d-4a31-921b-e151949932d1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.314366 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmd75\" (UniqueName: \"kubernetes.io/projected/6ce93581-d0da-4acc-978d-4c7b936d736b-kube-api-access-cmd75\") pod \"nmstate-metrics-69594cc75-gs4df\" (UID: \"6ce93581-d0da-4acc-978d-4c7b936d736b\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-gs4df" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.369031 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3c5b69b1-26a3-4de2-9d56-ffc97c64ddad-dbus-socket\") pod \"nmstate-handler-hb5qm\" (UID: \"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad\") " pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.369281 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3c5b69b1-26a3-4de2-9d56-ffc97c64ddad-ovs-socket\") pod \"nmstate-handler-hb5qm\" (UID: \"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad\") " pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.369363 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3c5b69b1-26a3-4de2-9d56-ffc97c64ddad-nmstate-lock\") pod \"nmstate-handler-hb5qm\" (UID: \"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad\") " pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.369477 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfdq2\" (UniqueName: \"kubernetes.io/projected/3c5b69b1-26a3-4de2-9d56-ffc97c64ddad-kube-api-access-vfdq2\") pod \"nmstate-handler-hb5qm\" (UID: \"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad\") " pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.369812 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3c5b69b1-26a3-4de2-9d56-ffc97c64ddad-dbus-socket\") pod \"nmstate-handler-hb5qm\" (UID: \"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad\") " pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.369950 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3c5b69b1-26a3-4de2-9d56-ffc97c64ddad-ovs-socket\") pod \"nmstate-handler-hb5qm\" (UID: \"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad\") " pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.370258 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/3c5b69b1-26a3-4de2-9d56-ffc97c64ddad-nmstate-lock\") pod \"nmstate-handler-hb5qm\" (UID: \"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad\") " pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.408242 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfdq2\" (UniqueName: \"kubernetes.io/projected/3c5b69b1-26a3-4de2-9d56-ffc97c64ddad-kube-api-access-vfdq2\") pod \"nmstate-handler-hb5qm\" (UID: \"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad\") " pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.425519 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54"] Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.426265 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.428188 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.428662 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-ggnkc" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.429415 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.477330 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54"] Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.522291 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-gs4df" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.556597 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.574233 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9zwh\" (UniqueName: \"kubernetes.io/projected/ac54500d-8e21-4b21-bb07-9ac1daf6ad08-kube-api-access-t9zwh\") pod \"nmstate-console-plugin-5dcbbd79cf-xdc54\" (UID: \"ac54500d-8e21-4b21-bb07-9ac1daf6ad08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.574388 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac54500d-8e21-4b21-bb07-9ac1daf6ad08-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-xdc54\" (UID: \"ac54500d-8e21-4b21-bb07-9ac1daf6ad08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.574423 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ac54500d-8e21-4b21-bb07-9ac1daf6ad08-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-xdc54\" (UID: \"ac54500d-8e21-4b21-bb07-9ac1daf6ad08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" Feb 25 16:04:59 crc kubenswrapper[4937]: W0225 16:04:59.606446 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c5b69b1_26a3_4de2_9d56_ffc97c64ddad.slice/crio-ebda68bff128127a17efa3e2ad8bf6d52344cda8b59d6e79f9b65d2b8a0e38c2 WatchSource:0}: Error finding container ebda68bff128127a17efa3e2ad8bf6d52344cda8b59d6e79f9b65d2b8a0e38c2: Status 404 returned error can't find the container with id ebda68bff128127a17efa3e2ad8bf6d52344cda8b59d6e79f9b65d2b8a0e38c2 Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.650050 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d8548bc8b-gdgqg"] Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.650760 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.675179 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9zwh\" (UniqueName: \"kubernetes.io/projected/ac54500d-8e21-4b21-bb07-9ac1daf6ad08-kube-api-access-t9zwh\") pod \"nmstate-console-plugin-5dcbbd79cf-xdc54\" (UID: \"ac54500d-8e21-4b21-bb07-9ac1daf6ad08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.675237 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fad364ca-34e5-498d-9bb3-76d1660452f9-service-ca\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.675266 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fad364ca-34e5-498d-9bb3-76d1660452f9-console-oauth-config\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.675299 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fad364ca-34e5-498d-9bb3-76d1660452f9-console-config\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.675323 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fad364ca-34e5-498d-9bb3-76d1660452f9-oauth-serving-cert\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.675414 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fad364ca-34e5-498d-9bb3-76d1660452f9-console-serving-cert\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.675431 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fad364ca-34e5-498d-9bb3-76d1660452f9-trusted-ca-bundle\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.675450 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac54500d-8e21-4b21-bb07-9ac1daf6ad08-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-xdc54\" (UID: \"ac54500d-8e21-4b21-bb07-9ac1daf6ad08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.675557 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-g99s9\" (UniqueName: \"kubernetes.io/projected/fad364ca-34e5-498d-9bb3-76d1660452f9-kube-api-access-g99s9\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.675592 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ac54500d-8e21-4b21-bb07-9ac1daf6ad08-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-xdc54\" (UID: \"ac54500d-8e21-4b21-bb07-9ac1daf6ad08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.676761 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ac54500d-8e21-4b21-bb07-9ac1daf6ad08-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-xdc54\" (UID: \"ac54500d-8e21-4b21-bb07-9ac1daf6ad08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.683986 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d8548bc8b-gdgqg"] Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.690867 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac54500d-8e21-4b21-bb07-9ac1daf6ad08-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-xdc54\" (UID: \"ac54500d-8e21-4b21-bb07-9ac1daf6ad08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.699836 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9zwh\" (UniqueName: \"kubernetes.io/projected/ac54500d-8e21-4b21-bb07-9ac1daf6ad08-kube-api-access-t9zwh\") pod \"nmstate-console-plugin-5dcbbd79cf-xdc54\" (UID: \"ac54500d-8e21-4b21-bb07-9ac1daf6ad08\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.751365 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.787686 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g99s9\" (UniqueName: \"kubernetes.io/projected/fad364ca-34e5-498d-9bb3-76d1660452f9-kube-api-access-g99s9\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.787748 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fad364ca-34e5-498d-9bb3-76d1660452f9-service-ca\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.787778 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fad364ca-34e5-498d-9bb3-76d1660452f9-console-oauth-config\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.787807 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fad364ca-34e5-498d-9bb3-76d1660452f9-console-config\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.787829 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fad364ca-34e5-498d-9bb3-76d1660452f9-oauth-serving-cert\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.787893 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fee0f5ff-b02d-4a31-921b-e151949932d1-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ljnzj\" (UID: \"fee0f5ff-b02d-4a31-921b-e151949932d1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.789052 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fad364ca-34e5-498d-9bb3-76d1660452f9-service-ca\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.789013 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fad364ca-34e5-498d-9bb3-76d1660452f9-console-config\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.791908 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fee0f5ff-b02d-4a31-921b-e151949932d1-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ljnzj\" (UID: \"fee0f5ff-b02d-4a31-921b-e151949932d1\") " 
pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.792756 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fad364ca-34e5-498d-9bb3-76d1660452f9-oauth-serving-cert\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.796328 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fad364ca-34e5-498d-9bb3-76d1660452f9-console-serving-cert\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.796381 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fad364ca-34e5-498d-9bb3-76d1660452f9-trusted-ca-bundle\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.804900 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fad364ca-34e5-498d-9bb3-76d1660452f9-trusted-ca-bundle\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.808508 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g99s9\" (UniqueName: \"kubernetes.io/projected/fad364ca-34e5-498d-9bb3-76d1660452f9-kube-api-access-g99s9\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.808746 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fad364ca-34e5-498d-9bb3-76d1660452f9-console-serving-cert\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.810941 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fad364ca-34e5-498d-9bb3-76d1660452f9-console-oauth-config\") pod \"console-6d8548bc8b-gdgqg\" (UID: \"fad364ca-34e5-498d-9bb3-76d1660452f9\") " pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.824184 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-gs4df"] Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.831022 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" Feb 25 16:04:59 crc kubenswrapper[4937]: W0225 16:04:59.834797 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ce93581_d0da_4acc_978d_4c7b936d736b.slice/crio-d627c9552e9b279dd5873320357f7beee611c3583cf80806faead39862c55c0d WatchSource:0}: Error finding container d627c9552e9b279dd5873320357f7beee611c3583cf80806faead39862c55c0d: Status 404 returned error can't find the container with id d627c9552e9b279dd5873320357f7beee611c3583cf80806faead39862c55c0d Feb 25 16:04:59 crc kubenswrapper[4937]: I0225 16:04:59.988660 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:05:00 crc kubenswrapper[4937]: I0225 16:05:00.069307 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj"] Feb 25 16:05:00 crc kubenswrapper[4937]: I0225 16:05:00.172899 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54"] Feb 25 16:05:00 crc kubenswrapper[4937]: W0225 16:05:00.182843 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac54500d_8e21_4b21_bb07_9ac1daf6ad08.slice/crio-97e57adc9aa2272831ffa76f51fb0f1acacad35e920316b451d9f7328604824e WatchSource:0}: Error finding container 97e57adc9aa2272831ffa76f51fb0f1acacad35e920316b451d9f7328604824e: Status 404 returned error can't find the container with id 97e57adc9aa2272831ffa76f51fb0f1acacad35e920316b451d9f7328604824e Feb 25 16:05:00 crc kubenswrapper[4937]: I0225 16:05:00.208272 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d8548bc8b-gdgqg"] Feb 25 16:05:00 crc kubenswrapper[4937]: W0225 16:05:00.217240 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad364ca_34e5_498d_9bb3_76d1660452f9.slice/crio-d4b3346f10d1f71f9db4b7d42c6c3c70a30a08a3e0171cafd86e77586cbc7d7a WatchSource:0}: Error finding container d4b3346f10d1f71f9db4b7d42c6c3c70a30a08a3e0171cafd86e77586cbc7d7a: Status 404 returned error can't find the container with id d4b3346f10d1f71f9db4b7d42c6c3c70a30a08a3e0171cafd86e77586cbc7d7a Feb 25 16:05:00 crc kubenswrapper[4937]: I0225 16:05:00.290246 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" event={"ID":"fee0f5ff-b02d-4a31-921b-e151949932d1","Type":"ContainerStarted","Data":"a469087ae00aacbca6ad781cc866034f5f3247af7688764adc65b255c8f4c614"} Feb 25 16:05:00 crc kubenswrapper[4937]: I0225 16:05:00.291067 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-gs4df" event={"ID":"6ce93581-d0da-4acc-978d-4c7b936d736b","Type":"ContainerStarted","Data":"d627c9552e9b279dd5873320357f7beee611c3583cf80806faead39862c55c0d"} Feb 25 16:05:00 crc kubenswrapper[4937]: I0225 16:05:00.292294 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8548bc8b-gdgqg" event={"ID":"fad364ca-34e5-498d-9bb3-76d1660452f9","Type":"ContainerStarted","Data":"d4b3346f10d1f71f9db4b7d42c6c3c70a30a08a3e0171cafd86e77586cbc7d7a"} Feb 25 16:05:00 crc kubenswrapper[4937]: I0225 16:05:00.293074 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" event={"ID":"ac54500d-8e21-4b21-bb07-9ac1daf6ad08","Type":"ContainerStarted","Data":"97e57adc9aa2272831ffa76f51fb0f1acacad35e920316b451d9f7328604824e"} Feb 25 16:05:00 crc kubenswrapper[4937]: I0225 16:05:00.293787 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hb5qm" event={"ID":"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad","Type":"ContainerStarted","Data":"ebda68bff128127a17efa3e2ad8bf6d52344cda8b59d6e79f9b65d2b8a0e38c2"} Feb 25 16:05:01 crc kubenswrapper[4937]: I0225 16:05:01.310063 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8548bc8b-gdgqg" event={"ID":"fad364ca-34e5-498d-9bb3-76d1660452f9","Type":"ContainerStarted","Data":"027124d4b2913f3a8d7543adb4d414a88aed17b42621c4c0bd9439f4a87b4463"} Feb 25 16:05:01 crc kubenswrapper[4937]: I0225 16:05:01.345462 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d8548bc8b-gdgqg" podStartSLOduration=2.3454470130000002 podStartE2EDuration="2.345447013s" podCreationTimestamp="2026-02-25 16:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:05:01.339881363 +0000 UTC m=+1152.353273263" watchObservedRunningTime="2026-02-25 16:05:01.345447013 +0000 UTC m=+1152.358838903" Feb 25 16:05:03 crc kubenswrapper[4937]: I0225 16:05:03.381419 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:05:03 crc kubenswrapper[4937]: I0225 16:05:03.381819 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:05:03 crc kubenswrapper[4937]: I0225 16:05:03.436084 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:05:04 crc kubenswrapper[4937]: I0225 16:05:04.343088 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-gs4df" event={"ID":"6ce93581-d0da-4acc-978d-4c7b936d736b","Type":"ContainerStarted","Data":"944c4f5e28e272d6eed53fe7e4a44913a610af6959cf70548a0abba39f286ddb"} Feb 25 16:05:04 crc kubenswrapper[4937]: I0225 16:05:04.344550 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" event={"ID":"ac54500d-8e21-4b21-bb07-9ac1daf6ad08","Type":"ContainerStarted","Data":"75d4efb5c4339837f11bafaaa74b3099213500c5e37124da8267c4f8a1897105"} Feb 25 16:05:04 crc kubenswrapper[4937]: I0225 16:05:04.347825 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hb5qm" event={"ID":"3c5b69b1-26a3-4de2-9d56-ffc97c64ddad","Type":"ContainerStarted","Data":"8aa3b13f7f24ad9c83c40270028dcba9512a4c5c7fdc02859212818b2b7e89ee"} Feb 25 16:05:04 crc kubenswrapper[4937]: I0225 16:05:04.348109 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:05:04 crc kubenswrapper[4937]: I0225 16:05:04.353769 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" event={"ID":"fee0f5ff-b02d-4a31-921b-e151949932d1","Type":"ContainerStarted","Data":"f7f4abb7941102e971966c9ffe96e2b9a5a3cbfb38e1f01ad3e1c01f13a82f0f"} Feb 25 16:05:04 crc kubenswrapper[4937]: I0225 16:05:04.373847 4937 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-xdc54" podStartSLOduration=2.239342491 podStartE2EDuration="5.373826828s" podCreationTimestamp="2026-02-25 16:04:59 +0000 UTC" firstStartedPulling="2026-02-25 16:05:00.186281903 +0000 UTC m=+1151.199673793" lastFinishedPulling="2026-02-25 16:05:03.32076623 +0000 UTC m=+1154.334158130" observedRunningTime="2026-02-25 16:05:04.369072369 +0000 UTC m=+1155.382464259" watchObservedRunningTime="2026-02-25 16:05:04.373826828 +0000 UTC m=+1155.387218718" Feb 25 16:05:04 crc kubenswrapper[4937]: I0225 16:05:04.387461 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-hb5qm" podStartSLOduration=1.6605616410000001 podStartE2EDuration="5.38744562s" podCreationTimestamp="2026-02-25 16:04:59 +0000 UTC" firstStartedPulling="2026-02-25 16:04:59.611948074 +0000 UTC m=+1150.625339954" lastFinishedPulling="2026-02-25 16:05:03.338832003 +0000 UTC m=+1154.352223933" observedRunningTime="2026-02-25 16:05:04.386819234 +0000 UTC m=+1155.400211124" watchObservedRunningTime="2026-02-25 16:05:04.38744562 +0000 UTC m=+1155.400837510" Feb 25 16:05:04 crc kubenswrapper[4937]: I0225 16:05:04.406965 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" podStartSLOduration=2.130443999 podStartE2EDuration="5.406946769s" podCreationTimestamp="2026-02-25 16:04:59 +0000 UTC" firstStartedPulling="2026-02-25 16:05:00.089222738 +0000 UTC m=+1151.102614628" lastFinishedPulling="2026-02-25 16:05:03.365725508 +0000 UTC m=+1154.379117398" observedRunningTime="2026-02-25 16:05:04.401941643 +0000 UTC m=+1155.415333533" watchObservedRunningTime="2026-02-25 16:05:04.406946769 +0000 UTC m=+1155.420338659" Feb 25 16:05:04 crc kubenswrapper[4937]: I0225 16:05:04.407536 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:05:04 crc kubenswrapper[4937]: I0225 16:05:04.451408 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z29n"] Feb 25 16:05:05 crc kubenswrapper[4937]: I0225 16:05:05.362088 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" Feb 25 16:05:06 crc kubenswrapper[4937]: I0225 16:05:06.367684 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5z29n" podUID="c03dc593-5e8a-4b48-bf54-b1ea3db41a34" containerName="registry-server" containerID="cri-o://281b236928e7c027ba145a717d130dd8ec23f9b91c002c4c6e837bf2349e8058" gracePeriod=2 Feb 25 16:05:06 crc kubenswrapper[4937]: I0225 16:05:06.800911 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:05:06 crc kubenswrapper[4937]: I0225 16:05:06.909474 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-catalog-content\") pod \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\" (UID: \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\") " Feb 25 16:05:06 crc kubenswrapper[4937]: I0225 16:05:06.909696 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-utilities\") pod \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\" (UID: \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\") " Feb 25 16:05:06 crc kubenswrapper[4937]: I0225 16:05:06.909818 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dffxm\" (UniqueName: \"kubernetes.io/projected/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-kube-api-access-dffxm\") pod \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\" (UID: \"c03dc593-5e8a-4b48-bf54-b1ea3db41a34\") " Feb 25 16:05:06 crc kubenswrapper[4937]: I0225 16:05:06.910764 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-utilities" (OuterVolumeSpecName: "utilities") pod "c03dc593-5e8a-4b48-bf54-b1ea3db41a34" (UID: "c03dc593-5e8a-4b48-bf54-b1ea3db41a34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:05:06 crc kubenswrapper[4937]: I0225 16:05:06.916396 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-kube-api-access-dffxm" (OuterVolumeSpecName: "kube-api-access-dffxm") pod "c03dc593-5e8a-4b48-bf54-b1ea3db41a34" (UID: "c03dc593-5e8a-4b48-bf54-b1ea3db41a34"). InnerVolumeSpecName "kube-api-access-dffxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:05:06 crc kubenswrapper[4937]: I0225 16:05:06.939136 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c03dc593-5e8a-4b48-bf54-b1ea3db41a34" (UID: "c03dc593-5e8a-4b48-bf54-b1ea3db41a34"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.011674 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.011737 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dffxm\" (UniqueName: \"kubernetes.io/projected/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-kube-api-access-dffxm\") on node \"crc\" DevicePath \"\"" Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.011754 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c03dc593-5e8a-4b48-bf54-b1ea3db41a34-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.392344 4937 generic.go:334] "Generic (PLEG): container finished" podID="c03dc593-5e8a-4b48-bf54-b1ea3db41a34" containerID="281b236928e7c027ba145a717d130dd8ec23f9b91c002c4c6e837bf2349e8058" exitCode=0 Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.392407 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z29n" event={"ID":"c03dc593-5e8a-4b48-bf54-b1ea3db41a34","Type":"ContainerDied","Data":"281b236928e7c027ba145a717d130dd8ec23f9b91c002c4c6e837bf2349e8058"} Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.392458 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z29n" event={"ID":"c03dc593-5e8a-4b48-bf54-b1ea3db41a34","Type":"ContainerDied","Data":"8a0c8da95957e5bf97963e8c5562a4819fe58c69b070803d18ff1033960d8de6"} Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.392473 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z29n" Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.392523 4937 scope.go:117] "RemoveContainer" containerID="281b236928e7c027ba145a717d130dd8ec23f9b91c002c4c6e837bf2349e8058" Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.421475 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z29n"] Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.427089 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z29n"] Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.529662 4937 scope.go:117] "RemoveContainer" containerID="56152e2c8a43224a54198c2c9df95fec2bfeeb20895b56ac1a3173ea644bcb24" Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.543072 4937 scope.go:117] "RemoveContainer" containerID="9243bd48d6e448a96cb37bac349193735b571f84b42ab617972eb80ad11880b5" Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.613376 4937 scope.go:117] "RemoveContainer" containerID="281b236928e7c027ba145a717d130dd8ec23f9b91c002c4c6e837bf2349e8058" Feb 25 16:05:07 crc kubenswrapper[4937]: E0225 16:05:07.613740 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281b236928e7c027ba145a717d130dd8ec23f9b91c002c4c6e837bf2349e8058\": container with ID starting with 281b236928e7c027ba145a717d130dd8ec23f9b91c002c4c6e837bf2349e8058 not found: ID does not exist" containerID="281b236928e7c027ba145a717d130dd8ec23f9b91c002c4c6e837bf2349e8058" Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.613778 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281b236928e7c027ba145a717d130dd8ec23f9b91c002c4c6e837bf2349e8058"} err="failed to get container status \"281b236928e7c027ba145a717d130dd8ec23f9b91c002c4c6e837bf2349e8058\": rpc error: code = NotFound desc = could not find container \"281b236928e7c027ba145a717d130dd8ec23f9b91c002c4c6e837bf2349e8058\": container with ID starting with 281b236928e7c027ba145a717d130dd8ec23f9b91c002c4c6e837bf2349e8058 not found: ID does not exist" Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.613803 4937 scope.go:117] "RemoveContainer" containerID="56152e2c8a43224a54198c2c9df95fec2bfeeb20895b56ac1a3173ea644bcb24" Feb 25 16:05:07 crc kubenswrapper[4937]: E0225 16:05:07.614160 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56152e2c8a43224a54198c2c9df95fec2bfeeb20895b56ac1a3173ea644bcb24\": container with ID starting with 56152e2c8a43224a54198c2c9df95fec2bfeeb20895b56ac1a3173ea644bcb24 not found: ID does not exist" containerID="56152e2c8a43224a54198c2c9df95fec2bfeeb20895b56ac1a3173ea644bcb24" Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.614190 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56152e2c8a43224a54198c2c9df95fec2bfeeb20895b56ac1a3173ea644bcb24"} err="failed to get container status \"56152e2c8a43224a54198c2c9df95fec2bfeeb20895b56ac1a3173ea644bcb24\": rpc error: code = NotFound desc = could not find container \"56152e2c8a43224a54198c2c9df95fec2bfeeb20895b56ac1a3173ea644bcb24\": container with ID starting with 56152e2c8a43224a54198c2c9df95fec2bfeeb20895b56ac1a3173ea644bcb24 not found: ID does not exist" Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.614209 4937 scope.go:117] "RemoveContainer" 
containerID="9243bd48d6e448a96cb37bac349193735b571f84b42ab617972eb80ad11880b5" Feb 25 16:05:07 crc kubenswrapper[4937]: E0225 16:05:07.614533 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9243bd48d6e448a96cb37bac349193735b571f84b42ab617972eb80ad11880b5\": container with ID starting with 9243bd48d6e448a96cb37bac349193735b571f84b42ab617972eb80ad11880b5 not found: ID does not exist" containerID="9243bd48d6e448a96cb37bac349193735b571f84b42ab617972eb80ad11880b5" Feb 25 16:05:07 crc kubenswrapper[4937]: I0225 16:05:07.614577 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9243bd48d6e448a96cb37bac349193735b571f84b42ab617972eb80ad11880b5"} err="failed to get container status \"9243bd48d6e448a96cb37bac349193735b571f84b42ab617972eb80ad11880b5\": rpc error: code = NotFound desc = could not find container \"9243bd48d6e448a96cb37bac349193735b571f84b42ab617972eb80ad11880b5\": container with ID starting with 9243bd48d6e448a96cb37bac349193735b571f84b42ab617972eb80ad11880b5 not found: ID does not exist" Feb 25 16:05:08 crc kubenswrapper[4937]: I0225 16:05:08.401223 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-gs4df" event={"ID":"6ce93581-d0da-4acc-978d-4c7b936d736b","Type":"ContainerStarted","Data":"94836baa0206e6c36f7b5c3e77ef6d76605e4baa3392412ceb8e9dd7bafd9f22"} Feb 25 16:05:08 crc kubenswrapper[4937]: I0225 16:05:08.414789 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-gs4df" podStartSLOduration=1.647059923 podStartE2EDuration="9.414756186s" podCreationTimestamp="2026-02-25 16:04:59 +0000 UTC" firstStartedPulling="2026-02-25 16:04:59.847872323 +0000 UTC m=+1150.861264213" lastFinishedPulling="2026-02-25 16:05:07.615568586 +0000 UTC m=+1158.628960476" observedRunningTime="2026-02-25 16:05:08.414242733 +0000 UTC m=+1159.427634623" watchObservedRunningTime="2026-02-25 16:05:08.414756186 +0000 UTC m=+1159.428148076" Feb 25 16:05:09 crc kubenswrapper[4937]: I0225 16:05:09.375566 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03dc593-5e8a-4b48-bf54-b1ea3db41a34" path="/var/lib/kubelet/pods/c03dc593-5e8a-4b48-bf54-b1ea3db41a34/volumes" Feb 25 16:05:09 crc kubenswrapper[4937]: I0225 16:05:09.580877 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-hb5qm" Feb 25 16:05:09 crc kubenswrapper[4937]: I0225 16:05:09.989363 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:05:09 crc kubenswrapper[4937]: I0225 16:05:09.989813 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:05:09 crc kubenswrapper[4937]: I0225 16:05:09.995865 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:05:10 crc kubenswrapper[4937]: I0225 16:05:10.418115 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d8548bc8b-gdgqg" Feb 25 16:05:10 crc kubenswrapper[4937]: I0225 16:05:10.471380 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-djs85"] Feb 25 16:05:19 crc kubenswrapper[4937]: I0225 16:05:19.836476 4937 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ljnzj" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.133206 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw"] Feb 25 16:05:35 crc kubenswrapper[4937]: E0225 16:05:35.134124 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03dc593-5e8a-4b48-bf54-b1ea3db41a34" containerName="extract-content" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.134141 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03dc593-5e8a-4b48-bf54-b1ea3db41a34" containerName="extract-content" Feb 25 16:05:35 crc kubenswrapper[4937]: E0225 16:05:35.134161 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03dc593-5e8a-4b48-bf54-b1ea3db41a34" containerName="extract-utilities" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.134168 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03dc593-5e8a-4b48-bf54-b1ea3db41a34" containerName="extract-utilities" Feb 25 16:05:35 crc kubenswrapper[4937]: E0225 16:05:35.134185 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03dc593-5e8a-4b48-bf54-b1ea3db41a34" containerName="registry-server" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.134191 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03dc593-5e8a-4b48-bf54-b1ea3db41a34" containerName="registry-server" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.134325 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03dc593-5e8a-4b48-bf54-b1ea3db41a34" containerName="registry-server" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.135209 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.138290 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.144925 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw"] Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.230777 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1d8e5a9-c042-4057-bda5-874d8f7fc926-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw\" (UID: \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.230850 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1d8e5a9-c042-4057-bda5-874d8f7fc926-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw\" (UID: \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.230910 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67stj\" (UniqueName: \"kubernetes.io/projected/c1d8e5a9-c042-4057-bda5-874d8f7fc926-kube-api-access-67stj\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw\" (UID: \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.332450 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1d8e5a9-c042-4057-bda5-874d8f7fc926-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw\" (UID: \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.332551 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67stj\" (UniqueName: \"kubernetes.io/projected/c1d8e5a9-c042-4057-bda5-874d8f7fc926-kube-api-access-67stj\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw\" (UID: \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.332692 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1d8e5a9-c042-4057-bda5-874d8f7fc926-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw\" (UID: \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.333310 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c1d8e5a9-c042-4057-bda5-874d8f7fc926-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw\" (UID: \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.333306 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1d8e5a9-c042-4057-bda5-874d8f7fc926-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw\" (UID: \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.363592 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67stj\" (UniqueName: \"kubernetes.io/projected/c1d8e5a9-c042-4057-bda5-874d8f7fc926-kube-api-access-67stj\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw\" (UID: \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.458226 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.532110 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-djs85" podUID="ff089f24-3d05-4c97-b6f7-3a39cbec049f" containerName="console" containerID="cri-o://78701e0723c954f51d37416168268396db1d90ff2146a8f0702433550f339d45" gracePeriod=15 Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.831606 4937 patch_prober.go:28] interesting pod/console-f9d7485db-djs85 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.832177 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-djs85" podUID="ff089f24-3d05-4c97-b6f7-3a39cbec049f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 25 16:05:35 crc kubenswrapper[4937]: I0225 16:05:35.902614 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw"] Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.034961 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-djs85_ff089f24-3d05-4c97-b6f7-3a39cbec049f/console/0.log" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.035074 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-djs85" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.141115 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-serving-cert\") pod \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.141160 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-config\") pod \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.141232 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-trusted-ca-bundle\") pod \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.141276 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-service-ca\") pod \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.141298 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-oauth-serving-cert\") pod \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.141360 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-786cs\" (UniqueName: \"kubernetes.io/projected/ff089f24-3d05-4c97-b6f7-3a39cbec049f-kube-api-access-786cs\") pod \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.141388 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-oauth-config\") pod \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\" (UID: \"ff089f24-3d05-4c97-b6f7-3a39cbec049f\") " Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.142177 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ff089f24-3d05-4c97-b6f7-3a39cbec049f" (UID: "ff089f24-3d05-4c97-b6f7-3a39cbec049f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.142191 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ff089f24-3d05-4c97-b6f7-3a39cbec049f" (UID: "ff089f24-3d05-4c97-b6f7-3a39cbec049f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.142213 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-service-ca" (OuterVolumeSpecName: "service-ca") pod "ff089f24-3d05-4c97-b6f7-3a39cbec049f" (UID: "ff089f24-3d05-4c97-b6f7-3a39cbec049f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.142802 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-config" (OuterVolumeSpecName: "console-config") pod "ff089f24-3d05-4c97-b6f7-3a39cbec049f" (UID: "ff089f24-3d05-4c97-b6f7-3a39cbec049f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.146880 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ff089f24-3d05-4c97-b6f7-3a39cbec049f" (UID: "ff089f24-3d05-4c97-b6f7-3a39cbec049f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.146901 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff089f24-3d05-4c97-b6f7-3a39cbec049f-kube-api-access-786cs" (OuterVolumeSpecName: "kube-api-access-786cs") pod "ff089f24-3d05-4c97-b6f7-3a39cbec049f" (UID: "ff089f24-3d05-4c97-b6f7-3a39cbec049f"). InnerVolumeSpecName "kube-api-access-786cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.148252 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ff089f24-3d05-4c97-b6f7-3a39cbec049f" (UID: "ff089f24-3d05-4c97-b6f7-3a39cbec049f"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.242340 4937 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.242371 4937 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.242380 4937 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.242391 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-786cs\" (UniqueName: \"kubernetes.io/projected/ff089f24-3d05-4c97-b6f7-3a39cbec049f-kube-api-access-786cs\") on node \"crc\" DevicePath \"\"" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.242400 4937 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.242409 4937 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.242416 4937 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff089f24-3d05-4c97-b6f7-3a39cbec049f-console-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.606381 4937 generic.go:334] "Generic (PLEG): container finished" podID="c1d8e5a9-c042-4057-bda5-874d8f7fc926" containerID="51b55b2a9116c7e72584df664786c6948e67050c575f9ffdcd76b84a75ec1128" exitCode=0 Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.606430 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" event={"ID":"c1d8e5a9-c042-4057-bda5-874d8f7fc926","Type":"ContainerDied","Data":"51b55b2a9116c7e72584df664786c6948e67050c575f9ffdcd76b84a75ec1128"} Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.606522 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" event={"ID":"c1d8e5a9-c042-4057-bda5-874d8f7fc926","Type":"ContainerStarted","Data":"3fd9a0ccc51b28d22929eecd2275bd2ceea8a55cf6b54c325d73cef9266637b8"} Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.615458 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-djs85_ff089f24-3d05-4c97-b6f7-3a39cbec049f/console/0.log" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.615550 4937 generic.go:334] "Generic (PLEG): container finished" podID="ff089f24-3d05-4c97-b6f7-3a39cbec049f" containerID="78701e0723c954f51d37416168268396db1d90ff2146a8f0702433550f339d45" exitCode=2 Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.615592 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-djs85" event={"ID":"ff089f24-3d05-4c97-b6f7-3a39cbec049f","Type":"ContainerDied","Data":"78701e0723c954f51d37416168268396db1d90ff2146a8f0702433550f339d45"} Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.615626 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-djs85" event={"ID":"ff089f24-3d05-4c97-b6f7-3a39cbec049f","Type":"ContainerDied","Data":"85ad2c9be1c0d93692e33a0f01838d85a804c6ecfc1da4810c3df73376f3da2c"} Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.615652 4937 scope.go:117] "RemoveContainer" containerID="78701e0723c954f51d37416168268396db1d90ff2146a8f0702433550f339d45" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.615820 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-djs85" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.653661 4937 scope.go:117] "RemoveContainer" containerID="78701e0723c954f51d37416168268396db1d90ff2146a8f0702433550f339d45" Feb 25 16:05:36 crc kubenswrapper[4937]: E0225 16:05:36.654066 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78701e0723c954f51d37416168268396db1d90ff2146a8f0702433550f339d45\": container with ID starting with 78701e0723c954f51d37416168268396db1d90ff2146a8f0702433550f339d45 not found: ID does not exist" containerID="78701e0723c954f51d37416168268396db1d90ff2146a8f0702433550f339d45" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.654098 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78701e0723c954f51d37416168268396db1d90ff2146a8f0702433550f339d45"} err="failed to get container status \"78701e0723c954f51d37416168268396db1d90ff2146a8f0702433550f339d45\": rpc error: code = NotFound desc = could not find container \"78701e0723c954f51d37416168268396db1d90ff2146a8f0702433550f339d45\": container with ID starting with 78701e0723c954f51d37416168268396db1d90ff2146a8f0702433550f339d45 not found: ID does not exist" Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.658883 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-djs85"] Feb 25 16:05:36 crc kubenswrapper[4937]: I0225 16:05:36.665081 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-djs85"] Feb 25 16:05:37 crc kubenswrapper[4937]: I0225 16:05:37.377687 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff089f24-3d05-4c97-b6f7-3a39cbec049f" path="/var/lib/kubelet/pods/ff089f24-3d05-4c97-b6f7-3a39cbec049f/volumes" Feb 25 16:05:39 crc kubenswrapper[4937]: I0225 16:05:39.646543 4937 generic.go:334] "Generic (PLEG): container finished" podID="c1d8e5a9-c042-4057-bda5-874d8f7fc926" containerID="00cac81ffe67c6234637aaf9cd8319ce4e67f14c817a35a2de9b1eb3c92cc216" exitCode=0 Feb 25 16:05:39 crc kubenswrapper[4937]: I0225 16:05:39.646671 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" event={"ID":"c1d8e5a9-c042-4057-bda5-874d8f7fc926","Type":"ContainerDied","Data":"00cac81ffe67c6234637aaf9cd8319ce4e67f14c817a35a2de9b1eb3c92cc216"} Feb 25 16:05:40 crc kubenswrapper[4937]: I0225 16:05:40.656384 4937 generic.go:334] "Generic (PLEG): container finished" podID="c1d8e5a9-c042-4057-bda5-874d8f7fc926" 
containerID="1a763ec8bb8b73879c3f95e3c61b89890065de7ed6418bf37ec2db7777944317" exitCode=0 Feb 25 16:05:40 crc kubenswrapper[4937]: I0225 16:05:40.656463 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" event={"ID":"c1d8e5a9-c042-4057-bda5-874d8f7fc926","Type":"ContainerDied","Data":"1a763ec8bb8b73879c3f95e3c61b89890065de7ed6418bf37ec2db7777944317"} Feb 25 16:05:41 crc kubenswrapper[4937]: I0225 16:05:41.937094 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" Feb 25 16:05:42 crc kubenswrapper[4937]: I0225 16:05:42.120674 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67stj\" (UniqueName: \"kubernetes.io/projected/c1d8e5a9-c042-4057-bda5-874d8f7fc926-kube-api-access-67stj\") pod \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\" (UID: \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\") " Feb 25 16:05:42 crc kubenswrapper[4937]: I0225 16:05:42.120737 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1d8e5a9-c042-4057-bda5-874d8f7fc926-bundle\") pod \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\" (UID: \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\") " Feb 25 16:05:42 crc kubenswrapper[4937]: I0225 16:05:42.120855 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1d8e5a9-c042-4057-bda5-874d8f7fc926-util\") pod \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\" (UID: \"c1d8e5a9-c042-4057-bda5-874d8f7fc926\") " Feb 25 16:05:42 crc kubenswrapper[4937]: I0225 16:05:42.122310 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1d8e5a9-c042-4057-bda5-874d8f7fc926-bundle" (OuterVolumeSpecName: "bundle") pod "c1d8e5a9-c042-4057-bda5-874d8f7fc926" (UID: "c1d8e5a9-c042-4057-bda5-874d8f7fc926"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:05:42 crc kubenswrapper[4937]: I0225 16:05:42.126925 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1d8e5a9-c042-4057-bda5-874d8f7fc926-kube-api-access-67stj" (OuterVolumeSpecName: "kube-api-access-67stj") pod "c1d8e5a9-c042-4057-bda5-874d8f7fc926" (UID: "c1d8e5a9-c042-4057-bda5-874d8f7fc926"). InnerVolumeSpecName "kube-api-access-67stj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:05:42 crc kubenswrapper[4937]: I0225 16:05:42.136647 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1d8e5a9-c042-4057-bda5-874d8f7fc926-util" (OuterVolumeSpecName: "util") pod "c1d8e5a9-c042-4057-bda5-874d8f7fc926" (UID: "c1d8e5a9-c042-4057-bda5-874d8f7fc926"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:05:42 crc kubenswrapper[4937]: I0225 16:05:42.222640 4937 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1d8e5a9-c042-4057-bda5-874d8f7fc926-util\") on node \"crc\" DevicePath \"\"" Feb 25 16:05:42 crc kubenswrapper[4937]: I0225 16:05:42.222730 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67stj\" (UniqueName: \"kubernetes.io/projected/c1d8e5a9-c042-4057-bda5-874d8f7fc926-kube-api-access-67stj\") on node \"crc\" DevicePath \"\"" Feb 25 16:05:42 crc kubenswrapper[4937]: I0225 16:05:42.222748 4937 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1d8e5a9-c042-4057-bda5-874d8f7fc926-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:05:42 crc kubenswrapper[4937]: I0225 16:05:42.674704 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" event={"ID":"c1d8e5a9-c042-4057-bda5-874d8f7fc926","Type":"ContainerDied","Data":"3fd9a0ccc51b28d22929eecd2275bd2ceea8a55cf6b54c325d73cef9266637b8"} Feb 25 16:05:42 crc kubenswrapper[4937]: I0225 16:05:42.674765 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fd9a0ccc51b28d22929eecd2275bd2ceea8a55cf6b54c325d73cef9266637b8" Feb 25 16:05:42 crc kubenswrapper[4937]: I0225 16:05:42.674779 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.779020 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-678f5df958-zlttq"] Feb 25 16:05:52 crc kubenswrapper[4937]: E0225 16:05:52.781233 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d8e5a9-c042-4057-bda5-874d8f7fc926" containerName="pull" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.781372 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d8e5a9-c042-4057-bda5-874d8f7fc926" containerName="pull" Feb 25 16:05:52 crc kubenswrapper[4937]: E0225 16:05:52.781504 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d8e5a9-c042-4057-bda5-874d8f7fc926" containerName="extract" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.781623 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d8e5a9-c042-4057-bda5-874d8f7fc926" containerName="extract" Feb 25 16:05:52 crc kubenswrapper[4937]: E0225 16:05:52.781767 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d8e5a9-c042-4057-bda5-874d8f7fc926" containerName="util" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.781905 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d8e5a9-c042-4057-bda5-874d8f7fc926" containerName="util" Feb 25 16:05:52 crc kubenswrapper[4937]: E0225 16:05:52.782055 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff089f24-3d05-4c97-b6f7-3a39cbec049f" containerName="console" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.782194 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff089f24-3d05-4c97-b6f7-3a39cbec049f" containerName="console" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.782477 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff089f24-3d05-4c97-b6f7-3a39cbec049f" containerName="console" Feb 
25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.782632 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1d8e5a9-c042-4057-bda5-874d8f7fc926" containerName="extract" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.783607 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.787659 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.787737 4937 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.788099 4937 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.788104 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.788292 4937 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-rfp9p" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.796374 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-678f5df958-zlttq"] Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.911865 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ad5751a-e32c-4f13-ab06-b3ddeb681961-apiservice-cert\") pod \"metallb-operator-controller-manager-678f5df958-zlttq\" (UID: \"8ad5751a-e32c-4f13-ab06-b3ddeb681961\") " pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.911932 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmr8l\" (UniqueName: \"kubernetes.io/projected/8ad5751a-e32c-4f13-ab06-b3ddeb681961-kube-api-access-fmr8l\") pod \"metallb-operator-controller-manager-678f5df958-zlttq\" (UID: \"8ad5751a-e32c-4f13-ab06-b3ddeb681961\") " pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" Feb 25 16:05:52 crc kubenswrapper[4937]: I0225 16:05:52.911977 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ad5751a-e32c-4f13-ab06-b3ddeb681961-webhook-cert\") pod \"metallb-operator-controller-manager-678f5df958-zlttq\" (UID: \"8ad5751a-e32c-4f13-ab06-b3ddeb681961\") " pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.013541 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ad5751a-e32c-4f13-ab06-b3ddeb681961-apiservice-cert\") pod \"metallb-operator-controller-manager-678f5df958-zlttq\" (UID: \"8ad5751a-e32c-4f13-ab06-b3ddeb681961\") " pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.013613 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmr8l\" (UniqueName: 
\"kubernetes.io/projected/8ad5751a-e32c-4f13-ab06-b3ddeb681961-kube-api-access-fmr8l\") pod \"metallb-operator-controller-manager-678f5df958-zlttq\" (UID: \"8ad5751a-e32c-4f13-ab06-b3ddeb681961\") " pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.013654 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ad5751a-e32c-4f13-ab06-b3ddeb681961-webhook-cert\") pod \"metallb-operator-controller-manager-678f5df958-zlttq\" (UID: \"8ad5751a-e32c-4f13-ab06-b3ddeb681961\") " pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.014575 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-59546f7477-2w52w"] Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.015943 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.019883 4937 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-btrxq" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.020105 4937 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.022052 4937 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.022309 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ad5751a-e32c-4f13-ab06-b3ddeb681961-webhook-cert\") pod \"metallb-operator-controller-manager-678f5df958-zlttq\" (UID: \"8ad5751a-e32c-4f13-ab06-b3ddeb681961\") " pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.023565 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ad5751a-e32c-4f13-ab06-b3ddeb681961-apiservice-cert\") pod \"metallb-operator-controller-manager-678f5df958-zlttq\" (UID: \"8ad5751a-e32c-4f13-ab06-b3ddeb681961\") " pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.031416 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-59546f7477-2w52w"] Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.038811 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmr8l\" (UniqueName: \"kubernetes.io/projected/8ad5751a-e32c-4f13-ab06-b3ddeb681961-kube-api-access-fmr8l\") pod \"metallb-operator-controller-manager-678f5df958-zlttq\" (UID: \"8ad5751a-e32c-4f13-ab06-b3ddeb681961\") " pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.096654 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.115364 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5mfw\" (UniqueName: \"kubernetes.io/projected/6faaefa8-4269-448f-90a9-b4af7b5b2eae-kube-api-access-d5mfw\") pod \"metallb-operator-webhook-server-59546f7477-2w52w\" (UID: \"6faaefa8-4269-448f-90a9-b4af7b5b2eae\") " pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.115442 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6faaefa8-4269-448f-90a9-b4af7b5b2eae-apiservice-cert\") pod \"metallb-operator-webhook-server-59546f7477-2w52w\" (UID: \"6faaefa8-4269-448f-90a9-b4af7b5b2eae\") " pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.115468 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6faaefa8-4269-448f-90a9-b4af7b5b2eae-webhook-cert\") pod \"metallb-operator-webhook-server-59546f7477-2w52w\" (UID: \"6faaefa8-4269-448f-90a9-b4af7b5b2eae\") " pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.216334 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5mfw\" (UniqueName: \"kubernetes.io/projected/6faaefa8-4269-448f-90a9-b4af7b5b2eae-kube-api-access-d5mfw\") pod \"metallb-operator-webhook-server-59546f7477-2w52w\" (UID: \"6faaefa8-4269-448f-90a9-b4af7b5b2eae\") " pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.216397 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6faaefa8-4269-448f-90a9-b4af7b5b2eae-apiservice-cert\") pod \"metallb-operator-webhook-server-59546f7477-2w52w\" (UID: \"6faaefa8-4269-448f-90a9-b4af7b5b2eae\") " pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.216428 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6faaefa8-4269-448f-90a9-b4af7b5b2eae-webhook-cert\") pod \"metallb-operator-webhook-server-59546f7477-2w52w\" (UID: \"6faaefa8-4269-448f-90a9-b4af7b5b2eae\") " pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.221270 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6faaefa8-4269-448f-90a9-b4af7b5b2eae-apiservice-cert\") pod \"metallb-operator-webhook-server-59546f7477-2w52w\" (UID: \"6faaefa8-4269-448f-90a9-b4af7b5b2eae\") " pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.236377 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6faaefa8-4269-448f-90a9-b4af7b5b2eae-webhook-cert\") pod \"metallb-operator-webhook-server-59546f7477-2w52w\" (UID: \"6faaefa8-4269-448f-90a9-b4af7b5b2eae\") " 
pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.250140 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5mfw\" (UniqueName: \"kubernetes.io/projected/6faaefa8-4269-448f-90a9-b4af7b5b2eae-kube-api-access-d5mfw\") pod \"metallb-operator-webhook-server-59546f7477-2w52w\" (UID: \"6faaefa8-4269-448f-90a9-b4af7b5b2eae\") " pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.376330 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.559198 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-678f5df958-zlttq"] Feb 25 16:05:53 crc kubenswrapper[4937]: W0225 16:05:53.565445 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ad5751a_e32c_4f13_ab06_b3ddeb681961.slice/crio-b2d3784c88c1c0273c1e04876ae5ae2b11125b5d90cfb158506d2b9a0678d998 WatchSource:0}: Error finding container b2d3784c88c1c0273c1e04876ae5ae2b11125b5d90cfb158506d2b9a0678d998: Status 404 returned error can't find the container with id b2d3784c88c1c0273c1e04876ae5ae2b11125b5d90cfb158506d2b9a0678d998 Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.607741 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-59546f7477-2w52w"] Feb 25 16:05:53 crc kubenswrapper[4937]: W0225 16:05:53.613139 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6faaefa8_4269_448f_90a9_b4af7b5b2eae.slice/crio-497b10037d55f57188408c6dc2dbcb006d36fba84ff41b4c332d9ac3e51532e0 WatchSource:0}: Error finding container 497b10037d55f57188408c6dc2dbcb006d36fba84ff41b4c332d9ac3e51532e0: Status 404 returned error can't find the container with id 497b10037d55f57188408c6dc2dbcb006d36fba84ff41b4c332d9ac3e51532e0 Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.738997 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" event={"ID":"8ad5751a-e32c-4f13-ab06-b3ddeb681961","Type":"ContainerStarted","Data":"b2d3784c88c1c0273c1e04876ae5ae2b11125b5d90cfb158506d2b9a0678d998"} Feb 25 16:05:53 crc kubenswrapper[4937]: I0225 16:05:53.740065 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" event={"ID":"6faaefa8-4269-448f-90a9-b4af7b5b2eae","Type":"ContainerStarted","Data":"497b10037d55f57188408c6dc2dbcb006d36fba84ff41b4c332d9ac3e51532e0"} Feb 25 16:05:58 crc kubenswrapper[4937]: I0225 16:05:58.779106 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" event={"ID":"8ad5751a-e32c-4f13-ab06-b3ddeb681961","Type":"ContainerStarted","Data":"b00972a26735143b92e707fa3f6617050f6f47c6354be9504e242120cdf323a2"} Feb 25 16:05:58 crc kubenswrapper[4937]: I0225 16:05:58.779802 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" Feb 25 16:05:58 crc kubenswrapper[4937]: I0225 16:05:58.808234 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" podStartSLOduration=2.731599129 podStartE2EDuration="6.808219172s" podCreationTimestamp="2026-02-25 16:05:52 +0000 UTC" firstStartedPulling="2026-02-25 16:05:53.569471933 +0000 UTC m=+1204.582863823" lastFinishedPulling="2026-02-25 16:05:57.646091976 +0000 UTC m=+1208.659483866" observedRunningTime="2026-02-25 16:05:58.807517885 +0000 UTC m=+1209.820909785" watchObservedRunningTime="2026-02-25 16:05:58.808219172 +0000 UTC m=+1209.821611062" Feb 25 16:05:59 crc kubenswrapper[4937]: I0225 16:05:59.785613 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" event={"ID":"6faaefa8-4269-448f-90a9-b4af7b5b2eae","Type":"ContainerStarted","Data":"ad82e7580c8f16c33a555ae8c5d5ecb733736e25318de5da79b9dfabd7aff84f"} Feb 25 16:05:59 crc kubenswrapper[4937]: I0225 16:05:59.805130 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" podStartSLOduration=2.073606751 podStartE2EDuration="7.805115393s" podCreationTimestamp="2026-02-25 16:05:52 +0000 UTC" firstStartedPulling="2026-02-25 16:05:53.616208256 +0000 UTC m=+1204.629600146" lastFinishedPulling="2026-02-25 16:05:59.347716898 +0000 UTC m=+1210.361108788" observedRunningTime="2026-02-25 16:05:59.80259948 +0000 UTC m=+1210.815991370" watchObservedRunningTime="2026-02-25 16:05:59.805115393 +0000 UTC m=+1210.818507283" Feb 25 16:06:00 crc kubenswrapper[4937]: I0225 16:06:00.121413 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533926-mtjxm"] Feb 25 16:06:00 crc kubenswrapper[4937]: I0225 16:06:00.122192 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533926-mtjxm" Feb 25 16:06:00 crc kubenswrapper[4937]: I0225 16:06:00.123375 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lbbw\" (UniqueName: \"kubernetes.io/projected/e9e30f3c-7001-4619-9e1f-3ec0c825aed4-kube-api-access-2lbbw\") pod \"auto-csr-approver-29533926-mtjxm\" (UID: \"e9e30f3c-7001-4619-9e1f-3ec0c825aed4\") " pod="openshift-infra/auto-csr-approver-29533926-mtjxm" Feb 25 16:06:00 crc kubenswrapper[4937]: I0225 16:06:00.124994 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:06:00 crc kubenswrapper[4937]: I0225 16:06:00.125381 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:06:00 crc kubenswrapper[4937]: I0225 16:06:00.125948 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:06:00 crc kubenswrapper[4937]: I0225 16:06:00.140508 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533926-mtjxm"] Feb 25 16:06:00 crc kubenswrapper[4937]: I0225 16:06:00.224753 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lbbw\" (UniqueName: \"kubernetes.io/projected/e9e30f3c-7001-4619-9e1f-3ec0c825aed4-kube-api-access-2lbbw\") pod \"auto-csr-approver-29533926-mtjxm\" (UID: \"e9e30f3c-7001-4619-9e1f-3ec0c825aed4\") " pod="openshift-infra/auto-csr-approver-29533926-mtjxm" Feb 25 16:06:00 crc kubenswrapper[4937]: I0225 16:06:00.246821 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lbbw\" (UniqueName: \"kubernetes.io/projected/e9e30f3c-7001-4619-9e1f-3ec0c825aed4-kube-api-access-2lbbw\") pod \"auto-csr-approver-29533926-mtjxm\" (UID: \"e9e30f3c-7001-4619-9e1f-3ec0c825aed4\") " pod="openshift-infra/auto-csr-approver-29533926-mtjxm" Feb 25 16:06:00 crc kubenswrapper[4937]: I0225 16:06:00.448418 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533926-mtjxm" Feb 25 16:06:00 crc kubenswrapper[4937]: I0225 16:06:00.791568 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" Feb 25 16:06:00 crc kubenswrapper[4937]: I0225 16:06:00.890136 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533926-mtjxm"] Feb 25 16:06:01 crc kubenswrapper[4937]: I0225 16:06:01.798194 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533926-mtjxm" event={"ID":"e9e30f3c-7001-4619-9e1f-3ec0c825aed4","Type":"ContainerStarted","Data":"d0bd9c12b55a1b14b2535525f9429dc11c86a9e9f8e7912cc0b69efa9f379d09"} Feb 25 16:06:02 crc kubenswrapper[4937]: I0225 16:06:02.805538 4937 generic.go:334] "Generic (PLEG): container finished" podID="e9e30f3c-7001-4619-9e1f-3ec0c825aed4" containerID="79448ac60046cd6dbf14cd22e9c5ed8e94fbf29fd6cb045ed5751281c9ff6629" exitCode=0 Feb 25 16:06:02 crc kubenswrapper[4937]: I0225 16:06:02.805593 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533926-mtjxm" event={"ID":"e9e30f3c-7001-4619-9e1f-3ec0c825aed4","Type":"ContainerDied","Data":"79448ac60046cd6dbf14cd22e9c5ed8e94fbf29fd6cb045ed5751281c9ff6629"} Feb 25 16:06:04 crc kubenswrapper[4937]: I0225 16:06:04.119674 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533926-mtjxm" Feb 25 16:06:04 crc kubenswrapper[4937]: I0225 16:06:04.271844 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lbbw\" (UniqueName: \"kubernetes.io/projected/e9e30f3c-7001-4619-9e1f-3ec0c825aed4-kube-api-access-2lbbw\") pod \"e9e30f3c-7001-4619-9e1f-3ec0c825aed4\" (UID: \"e9e30f3c-7001-4619-9e1f-3ec0c825aed4\") " Feb 25 16:06:04 crc kubenswrapper[4937]: I0225 16:06:04.280764 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e30f3c-7001-4619-9e1f-3ec0c825aed4-kube-api-access-2lbbw" (OuterVolumeSpecName: "kube-api-access-2lbbw") pod "e9e30f3c-7001-4619-9e1f-3ec0c825aed4" (UID: "e9e30f3c-7001-4619-9e1f-3ec0c825aed4"). InnerVolumeSpecName "kube-api-access-2lbbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:06:04 crc kubenswrapper[4937]: I0225 16:06:04.373400 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lbbw\" (UniqueName: \"kubernetes.io/projected/e9e30f3c-7001-4619-9e1f-3ec0c825aed4-kube-api-access-2lbbw\") on node \"crc\" DevicePath \"\"" Feb 25 16:06:04 crc kubenswrapper[4937]: I0225 16:06:04.831283 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533926-mtjxm" event={"ID":"e9e30f3c-7001-4619-9e1f-3ec0c825aed4","Type":"ContainerDied","Data":"d0bd9c12b55a1b14b2535525f9429dc11c86a9e9f8e7912cc0b69efa9f379d09"} Feb 25 16:06:04 crc kubenswrapper[4937]: I0225 16:06:04.831331 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0bd9c12b55a1b14b2535525f9429dc11c86a9e9f8e7912cc0b69efa9f379d09" Feb 25 16:06:04 crc kubenswrapper[4937]: I0225 16:06:04.831731 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533926-mtjxm" Feb 25 16:06:05 crc kubenswrapper[4937]: I0225 16:06:05.219359 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533920-8jlbb"] Feb 25 16:06:05 crc kubenswrapper[4937]: I0225 16:06:05.222836 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533920-8jlbb"] Feb 25 16:06:05 crc kubenswrapper[4937]: I0225 16:06:05.376741 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="470f17dd-ee35-4eb9-b7d2-2815acdc1b9c" path="/var/lib/kubelet/pods/470f17dd-ee35-4eb9-b7d2-2815acdc1b9c/volumes" Feb 25 16:06:13 crc kubenswrapper[4937]: I0225 16:06:13.381326 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-59546f7477-2w52w" Feb 25 16:06:21 crc kubenswrapper[4937]: I0225 16:06:21.942478 4937 scope.go:117] "RemoveContainer" containerID="92e63649e314d4809a104909b649a75fea923aa2e67fb276105f8f47f22b467d" Feb 25 16:06:33 crc kubenswrapper[4937]: I0225 16:06:33.099950 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-678f5df958-zlttq" Feb 25 16:06:33 crc kubenswrapper[4937]: I0225 16:06:33.929280 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj"] Feb 25 16:06:33 crc kubenswrapper[4937]: E0225 16:06:33.929895 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e30f3c-7001-4619-9e1f-3ec0c825aed4" containerName="oc" Feb 25 16:06:33 crc kubenswrapper[4937]: I0225 16:06:33.929919 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e30f3c-7001-4619-9e1f-3ec0c825aed4" containerName="oc" Feb 25 16:06:33 crc kubenswrapper[4937]: I0225 16:06:33.930038 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9e30f3c-7001-4619-9e1f-3ec0c825aed4" containerName="oc" Feb 25 16:06:33 crc kubenswrapper[4937]: I0225 16:06:33.930466 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj" Feb 25 16:06:33 crc kubenswrapper[4937]: I0225 16:06:33.933292 4937 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 25 16:06:33 crc kubenswrapper[4937]: I0225 16:06:33.935166 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-qlk4x"] Feb 25 16:06:33 crc kubenswrapper[4937]: I0225 16:06:33.935368 4937 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-b8mcw" Feb 25 16:06:33 crc kubenswrapper[4937]: I0225 16:06:33.938514 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:33 crc kubenswrapper[4937]: I0225 16:06:33.940936 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 25 16:06:33 crc kubenswrapper[4937]: I0225 16:06:33.950828 4937 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 25 16:06:33 crc kubenswrapper[4937]: I0225 16:06:33.978888 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj"] Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.038307 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-vpqx7"] Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.041677 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-vpqx7" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.047405 4937 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-nt2pj" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.047654 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.047821 4937 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.047989 4937 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.060443 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-5vzl9"] Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.070553 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-5vzl9" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.074758 4937 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.087350 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-5vzl9"] Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.090294 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/96b71f98-1da6-4122-828b-1d58fd8e40d3-frr-startup\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.090358 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3b9485a-9a4f-467b-9e99-e858b7b47a8b-cert\") pod \"frr-k8s-webhook-server-7f989f654f-zl6xj\" (UID: \"f3b9485a-9a4f-467b-9e99-e858b7b47a8b\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.090393 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/96b71f98-1da6-4122-828b-1d58fd8e40d3-metrics\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.090426 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng764\" (UniqueName: \"kubernetes.io/projected/f3b9485a-9a4f-467b-9e99-e858b7b47a8b-kube-api-access-ng764\") pod \"frr-k8s-webhook-server-7f989f654f-zl6xj\" (UID: \"f3b9485a-9a4f-467b-9e99-e858b7b47a8b\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.090452 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/96b71f98-1da6-4122-828b-1d58fd8e40d3-frr-conf\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.090506 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96b71f98-1da6-4122-828b-1d58fd8e40d3-metrics-certs\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.090565 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/96b71f98-1da6-4122-828b-1d58fd8e40d3-frr-sockets\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.090591 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/96b71f98-1da6-4122-828b-1d58fd8e40d3-reloader\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc 
kubenswrapper[4937]: I0225 16:06:34.090615 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvdlt\" (UniqueName: \"kubernetes.io/projected/96b71f98-1da6-4122-828b-1d58fd8e40d3-kube-api-access-dvdlt\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.191741 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/96b71f98-1da6-4122-828b-1d58fd8e40d3-frr-startup\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.191794 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8c24e8b5-c791-4ceb-9258-fba04c4adf91-metallb-excludel2\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.191819 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3b9485a-9a4f-467b-9e99-e858b7b47a8b-cert\") pod \"frr-k8s-webhook-server-7f989f654f-zl6xj\" (UID: \"f3b9485a-9a4f-467b-9e99-e858b7b47a8b\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.191835 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-memberlist\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.191851 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjv5p\" (UniqueName: \"kubernetes.io/projected/8c24e8b5-c791-4ceb-9258-fba04c4adf91-kube-api-access-tjv5p\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.191872 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/96b71f98-1da6-4122-828b-1d58fd8e40d3-metrics\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.191898 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng764\" (UniqueName: \"kubernetes.io/projected/f3b9485a-9a4f-467b-9e99-e858b7b47a8b-kube-api-access-ng764\") pod \"frr-k8s-webhook-server-7f989f654f-zl6xj\" (UID: \"f3b9485a-9a4f-467b-9e99-e858b7b47a8b\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.191915 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-metrics-certs\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.192128 4937 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/96b71f98-1da6-4122-828b-1d58fd8e40d3-frr-conf\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.192242 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96b71f98-1da6-4122-828b-1d58fd8e40d3-metrics-certs\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.192287 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl2q4\" (UniqueName: \"kubernetes.io/projected/53cf6067-7864-4449-9f64-2cf8181fec1d-kube-api-access-xl2q4\") pod \"controller-86ddb6bd46-5vzl9\" (UID: \"53cf6067-7864-4449-9f64-2cf8181fec1d\") " pod="metallb-system/controller-86ddb6bd46-5vzl9" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.192314 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/96b71f98-1da6-4122-828b-1d58fd8e40d3-metrics\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.192334 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/96b71f98-1da6-4122-828b-1d58fd8e40d3-frr-sockets\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.192360 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53cf6067-7864-4449-9f64-2cf8181fec1d-metrics-certs\") pod \"controller-86ddb6bd46-5vzl9\" (UID: \"53cf6067-7864-4449-9f64-2cf8181fec1d\") " pod="metallb-system/controller-86ddb6bd46-5vzl9" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.192384 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53cf6067-7864-4449-9f64-2cf8181fec1d-cert\") pod \"controller-86ddb6bd46-5vzl9\" (UID: \"53cf6067-7864-4449-9f64-2cf8181fec1d\") " pod="metallb-system/controller-86ddb6bd46-5vzl9" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.192409 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/96b71f98-1da6-4122-828b-1d58fd8e40d3-reloader\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.192433 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvdlt\" (UniqueName: \"kubernetes.io/projected/96b71f98-1da6-4122-828b-1d58fd8e40d3-kube-api-access-dvdlt\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.192636 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/96b71f98-1da6-4122-828b-1d58fd8e40d3-frr-sockets\") pod \"frr-k8s-qlk4x\" (UID: 
\"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.192646 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/96b71f98-1da6-4122-828b-1d58fd8e40d3-frr-conf\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.192766 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/96b71f98-1da6-4122-828b-1d58fd8e40d3-reloader\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.193006 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/96b71f98-1da6-4122-828b-1d58fd8e40d3-frr-startup\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.197312 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96b71f98-1da6-4122-828b-1d58fd8e40d3-metrics-certs\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.197619 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3b9485a-9a4f-467b-9e99-e858b7b47a8b-cert\") pod \"frr-k8s-webhook-server-7f989f654f-zl6xj\" (UID: \"f3b9485a-9a4f-467b-9e99-e858b7b47a8b\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.206757 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvdlt\" (UniqueName: \"kubernetes.io/projected/96b71f98-1da6-4122-828b-1d58fd8e40d3-kube-api-access-dvdlt\") pod \"frr-k8s-qlk4x\" (UID: \"96b71f98-1da6-4122-828b-1d58fd8e40d3\") " pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.208416 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng764\" (UniqueName: \"kubernetes.io/projected/f3b9485a-9a4f-467b-9e99-e858b7b47a8b-kube-api-access-ng764\") pod \"frr-k8s-webhook-server-7f989f654f-zl6xj\" (UID: \"f3b9485a-9a4f-467b-9e99-e858b7b47a8b\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.250392 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.261600 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.293831 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53cf6067-7864-4449-9f64-2cf8181fec1d-metrics-certs\") pod \"controller-86ddb6bd46-5vzl9\" (UID: \"53cf6067-7864-4449-9f64-2cf8181fec1d\") " pod="metallb-system/controller-86ddb6bd46-5vzl9" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.293891 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53cf6067-7864-4449-9f64-2cf8181fec1d-cert\") pod \"controller-86ddb6bd46-5vzl9\" (UID: \"53cf6067-7864-4449-9f64-2cf8181fec1d\") " pod="metallb-system/controller-86ddb6bd46-5vzl9" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.293936 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8c24e8b5-c791-4ceb-9258-fba04c4adf91-metallb-excludel2\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.293956 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-memberlist\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.293971 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjv5p\" (UniqueName: \"kubernetes.io/projected/8c24e8b5-c791-4ceb-9258-fba04c4adf91-kube-api-access-tjv5p\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.294006 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-metrics-certs\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.294029 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl2q4\" (UniqueName: \"kubernetes.io/projected/53cf6067-7864-4449-9f64-2cf8181fec1d-kube-api-access-xl2q4\") pod \"controller-86ddb6bd46-5vzl9\" (UID: \"53cf6067-7864-4449-9f64-2cf8181fec1d\") " pod="metallb-system/controller-86ddb6bd46-5vzl9" Feb 25 16:06:34 crc kubenswrapper[4937]: E0225 16:06:34.294433 4937 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 25 16:06:34 crc kubenswrapper[4937]: E0225 16:06:34.294533 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-memberlist podName:8c24e8b5-c791-4ceb-9258-fba04c4adf91 nodeName:}" failed. No retries permitted until 2026-02-25 16:06:34.794511768 +0000 UTC m=+1245.807903678 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-memberlist") pod "speaker-vpqx7" (UID: "8c24e8b5-c791-4ceb-9258-fba04c4adf91") : secret "metallb-memberlist" not found Feb 25 16:06:34 crc kubenswrapper[4937]: E0225 16:06:34.294721 4937 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 25 16:06:34 crc kubenswrapper[4937]: E0225 16:06:34.294752 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-metrics-certs podName:8c24e8b5-c791-4ceb-9258-fba04c4adf91 nodeName:}" failed. No retries permitted until 2026-02-25 16:06:34.794742984 +0000 UTC m=+1245.808134884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-metrics-certs") pod "speaker-vpqx7" (UID: "8c24e8b5-c791-4ceb-9258-fba04c4adf91") : secret "speaker-certs-secret" not found Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.294938 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8c24e8b5-c791-4ceb-9258-fba04c4adf91-metallb-excludel2\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.300121 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53cf6067-7864-4449-9f64-2cf8181fec1d-metrics-certs\") pod \"controller-86ddb6bd46-5vzl9\" (UID: \"53cf6067-7864-4449-9f64-2cf8181fec1d\") " pod="metallb-system/controller-86ddb6bd46-5vzl9" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.300497 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53cf6067-7864-4449-9f64-2cf8181fec1d-cert\") pod \"controller-86ddb6bd46-5vzl9\" (UID: \"53cf6067-7864-4449-9f64-2cf8181fec1d\") " pod="metallb-system/controller-86ddb6bd46-5vzl9" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.309183 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjv5p\" (UniqueName: \"kubernetes.io/projected/8c24e8b5-c791-4ceb-9258-fba04c4adf91-kube-api-access-tjv5p\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.312442 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl2q4\" (UniqueName: \"kubernetes.io/projected/53cf6067-7864-4449-9f64-2cf8181fec1d-kube-api-access-xl2q4\") pod \"controller-86ddb6bd46-5vzl9\" (UID: \"53cf6067-7864-4449-9f64-2cf8181fec1d\") " pod="metallb-system/controller-86ddb6bd46-5vzl9" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.385142 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-5vzl9" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.707990 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj"] Feb 25 16:06:34 crc kubenswrapper[4937]: W0225 16:06:34.715565 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3b9485a_9a4f_467b_9e99_e858b7b47a8b.slice/crio-28b4d2cf0b49e5aa9e8268008d5e333708a17ef06f60a8d2cff052270d96b500 WatchSource:0}: Error finding container 28b4d2cf0b49e5aa9e8268008d5e333708a17ef06f60a8d2cff052270d96b500: Status 404 returned error can't find the container with id 28b4d2cf0b49e5aa9e8268008d5e333708a17ef06f60a8d2cff052270d96b500 Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.791160 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-5vzl9"] Feb 25 16:06:34 crc kubenswrapper[4937]: W0225 16:06:34.799896 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53cf6067_7864_4449_9f64_2cf8181fec1d.slice/crio-3b362b62b83c416f0312b3d3b62dac87b865643b3845d016085a5b19cd06b10f WatchSource:0}: Error finding container 3b362b62b83c416f0312b3d3b62dac87b865643b3845d016085a5b19cd06b10f: Status 404 returned error can't find the container with id 3b362b62b83c416f0312b3d3b62dac87b865643b3845d016085a5b19cd06b10f Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.801559 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-memberlist\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.801610 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-metrics-certs\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:34 crc kubenswrapper[4937]: E0225 16:06:34.801720 4937 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 25 16:06:34 crc kubenswrapper[4937]: E0225 16:06:34.801785 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-memberlist podName:8c24e8b5-c791-4ceb-9258-fba04c4adf91 nodeName:}" failed. No retries permitted until 2026-02-25 16:06:35.801771128 +0000 UTC m=+1246.815163018 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-memberlist") pod "speaker-vpqx7" (UID: "8c24e8b5-c791-4ceb-9258-fba04c4adf91") : secret "metallb-memberlist" not found Feb 25 16:06:34 crc kubenswrapper[4937]: I0225 16:06:34.808953 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-metrics-certs\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:35 crc kubenswrapper[4937]: I0225 16:06:35.046125 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qlk4x" event={"ID":"96b71f98-1da6-4122-828b-1d58fd8e40d3","Type":"ContainerStarted","Data":"f4c458ced6e4f396ec48cf52d23bf83a4054e6355d2a9d5fe74d821601ae7d30"} Feb 25 16:06:35 crc kubenswrapper[4937]: I0225 16:06:35.048224 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-5vzl9" event={"ID":"53cf6067-7864-4449-9f64-2cf8181fec1d","Type":"ContainerStarted","Data":"4186341da4aa142007fb5a1d73ce0abb9b5730c19da892a5bf34694164be5eae"} Feb 25 16:06:35 crc kubenswrapper[4937]: I0225 16:06:35.048250 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-5vzl9" event={"ID":"53cf6067-7864-4449-9f64-2cf8181fec1d","Type":"ContainerStarted","Data":"7612cca356c0d589e29db8bb8078de7298d8ed03618f2e11dd6183172d58cef7"} Feb 25 16:06:35 crc kubenswrapper[4937]: I0225 16:06:35.048259 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-5vzl9" event={"ID":"53cf6067-7864-4449-9f64-2cf8181fec1d","Type":"ContainerStarted","Data":"3b362b62b83c416f0312b3d3b62dac87b865643b3845d016085a5b19cd06b10f"} Feb 25 16:06:35 crc kubenswrapper[4937]: I0225 16:06:35.048375 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-5vzl9" Feb 25 16:06:35 crc kubenswrapper[4937]: I0225 16:06:35.049389 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj" event={"ID":"f3b9485a-9a4f-467b-9e99-e858b7b47a8b","Type":"ContainerStarted","Data":"28b4d2cf0b49e5aa9e8268008d5e333708a17ef06f60a8d2cff052270d96b500"} Feb 25 16:06:35 crc kubenswrapper[4937]: I0225 16:06:35.853361 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-memberlist\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:35 crc kubenswrapper[4937]: I0225 16:06:35.869123 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8c24e8b5-c791-4ceb-9258-fba04c4adf91-memberlist\") pod \"speaker-vpqx7\" (UID: \"8c24e8b5-c791-4ceb-9258-fba04c4adf91\") " pod="metallb-system/speaker-vpqx7" Feb 25 16:06:35 crc kubenswrapper[4937]: I0225 16:06:35.869468 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-vpqx7" Feb 25 16:06:35 crc kubenswrapper[4937]: W0225 16:06:35.898202 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c24e8b5_c791_4ceb_9258_fba04c4adf91.slice/crio-b071b6d0c849649c0c70dcce4c5fb8e32e54b829d5e910e559870584354ff954 WatchSource:0}: Error finding container b071b6d0c849649c0c70dcce4c5fb8e32e54b829d5e910e559870584354ff954: Status 404 returned error can't find the container with id b071b6d0c849649c0c70dcce4c5fb8e32e54b829d5e910e559870584354ff954 Feb 25 16:06:36 crc kubenswrapper[4937]: I0225 16:06:36.058694 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vpqx7" event={"ID":"8c24e8b5-c791-4ceb-9258-fba04c4adf91","Type":"ContainerStarted","Data":"b071b6d0c849649c0c70dcce4c5fb8e32e54b829d5e910e559870584354ff954"} Feb 25 16:06:37 crc kubenswrapper[4937]: I0225 16:06:37.066236 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vpqx7" event={"ID":"8c24e8b5-c791-4ceb-9258-fba04c4adf91","Type":"ContainerStarted","Data":"b0ff3fb404012c14229a1fa5da234ad1e0d064d1dd86e57cadd6fea042db159d"} Feb 25 16:06:38 crc kubenswrapper[4937]: I0225 16:06:38.078189 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vpqx7" event={"ID":"8c24e8b5-c791-4ceb-9258-fba04c4adf91","Type":"ContainerStarted","Data":"1dcbdec5cb52b718b2300272e32a9ee370fd26cc1f96cbff97120b972bf438e0"} Feb 25 16:06:38 crc kubenswrapper[4937]: I0225 16:06:38.078470 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-vpqx7" Feb 25 16:06:38 crc kubenswrapper[4937]: I0225 16:06:38.101246 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-5vzl9" podStartSLOduration=4.101228057 podStartE2EDuration="4.101228057s" podCreationTimestamp="2026-02-25 16:06:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:06:35.063679811 +0000 UTC m=+1246.077071701" watchObservedRunningTime="2026-02-25 16:06:38.101228057 +0000 UTC m=+1249.114619947" Feb 25 16:06:38 crc kubenswrapper[4937]: I0225 16:06:38.105920 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-vpqx7" podStartSLOduration=4.105905035 podStartE2EDuration="4.105905035s" podCreationTimestamp="2026-02-25 16:06:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:06:38.100612842 +0000 UTC m=+1249.114004742" watchObservedRunningTime="2026-02-25 16:06:38.105905035 +0000 UTC m=+1249.119296925" Feb 25 16:06:41 crc kubenswrapper[4937]: I0225 16:06:41.495050 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:06:41 crc kubenswrapper[4937]: I0225 16:06:41.495480 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 25 16:06:44 crc kubenswrapper[4937]: I0225 16:06:44.127052 4937 generic.go:334] "Generic (PLEG): container finished" podID="96b71f98-1da6-4122-828b-1d58fd8e40d3" containerID="1cc74bbf55e373ee4df2c46e876416ed996d094973ea8d7c7f45a6274560e5c1" exitCode=0 Feb 25 16:06:44 crc kubenswrapper[4937]: I0225 16:06:44.127179 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qlk4x" event={"ID":"96b71f98-1da6-4122-828b-1d58fd8e40d3","Type":"ContainerDied","Data":"1cc74bbf55e373ee4df2c46e876416ed996d094973ea8d7c7f45a6274560e5c1"} Feb 25 16:06:44 crc kubenswrapper[4937]: I0225 16:06:44.131948 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj" event={"ID":"f3b9485a-9a4f-467b-9e99-e858b7b47a8b","Type":"ContainerStarted","Data":"b6a274ca78b36f3a51c102799843850a54b72200b5e577d064f4a03f7389f16b"} Feb 25 16:06:44 crc kubenswrapper[4937]: I0225 16:06:44.132121 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj" Feb 25 16:06:44 crc kubenswrapper[4937]: I0225 16:06:44.188285 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj" podStartSLOduration=1.968029145 podStartE2EDuration="11.18826282s" podCreationTimestamp="2026-02-25 16:06:33 +0000 UTC" firstStartedPulling="2026-02-25 16:06:34.717402997 +0000 UTC m=+1245.730794887" lastFinishedPulling="2026-02-25 16:06:43.937636672 +0000 UTC m=+1254.951028562" observedRunningTime="2026-02-25 16:06:44.184951147 +0000 UTC m=+1255.198343047" watchObservedRunningTime="2026-02-25 16:06:44.18826282 +0000 UTC m=+1255.201654710" Feb 25 16:06:44 crc kubenswrapper[4937]: I0225 16:06:44.390998 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-5vzl9" Feb 25 16:06:45 crc kubenswrapper[4937]: I0225 16:06:45.140515 4937 generic.go:334] "Generic (PLEG): container finished" podID="96b71f98-1da6-4122-828b-1d58fd8e40d3" containerID="5f26bf1d57902eec5585fbaef461aad11f431961e24730916dcf4dde3d712633" exitCode=0 Feb 25 16:06:45 crc kubenswrapper[4937]: I0225 16:06:45.140704 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qlk4x" event={"ID":"96b71f98-1da6-4122-828b-1d58fd8e40d3","Type":"ContainerDied","Data":"5f26bf1d57902eec5585fbaef461aad11f431961e24730916dcf4dde3d712633"} Feb 25 16:06:46 crc kubenswrapper[4937]: I0225 16:06:46.149290 4937 generic.go:334] "Generic (PLEG): container finished" podID="96b71f98-1da6-4122-828b-1d58fd8e40d3" containerID="e03e6bec90bfff494f3bcde9dd41f94f016ef5b169c8066f307d0fd3dc2b36bd" exitCode=0 Feb 25 16:06:46 crc kubenswrapper[4937]: I0225 16:06:46.149336 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qlk4x" event={"ID":"96b71f98-1da6-4122-828b-1d58fd8e40d3","Type":"ContainerDied","Data":"e03e6bec90bfff494f3bcde9dd41f94f016ef5b169c8066f307d0fd3dc2b36bd"} Feb 25 16:06:47 crc kubenswrapper[4937]: I0225 16:06:47.163504 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qlk4x" event={"ID":"96b71f98-1da6-4122-828b-1d58fd8e40d3","Type":"ContainerStarted","Data":"07d26514b51b4b33ffad10732142a69c702b21dea3fa9b20e1a7f89ab9007c0f"} Feb 25 16:06:47 crc kubenswrapper[4937]: I0225 16:06:47.163822 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qlk4x" 
event={"ID":"96b71f98-1da6-4122-828b-1d58fd8e40d3","Type":"ContainerStarted","Data":"87d5ccab01eedc6aedf66fe48fcfea7011a1aebe3f9d9b610a6e9ace1ff84acd"} Feb 25 16:06:47 crc kubenswrapper[4937]: I0225 16:06:47.163834 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qlk4x" event={"ID":"96b71f98-1da6-4122-828b-1d58fd8e40d3","Type":"ContainerStarted","Data":"3b2b773c8015a2aa1f42faae3550d7f963e76ad838b66bdfa38c59411a30401e"} Feb 25 16:06:47 crc kubenswrapper[4937]: I0225 16:06:47.163846 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qlk4x" event={"ID":"96b71f98-1da6-4122-828b-1d58fd8e40d3","Type":"ContainerStarted","Data":"c9482ea4f1f4b057172b40ca634ff8d1158c208fd200bf90fb4e7c888a2fb045"} Feb 25 16:06:47 crc kubenswrapper[4937]: I0225 16:06:47.163857 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qlk4x" event={"ID":"96b71f98-1da6-4122-828b-1d58fd8e40d3","Type":"ContainerStarted","Data":"283a75246f69bc19eedac81093e192fdc76d94b002e9db7925b10a773f1d8129"} Feb 25 16:06:48 crc kubenswrapper[4937]: I0225 16:06:48.178445 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qlk4x" event={"ID":"96b71f98-1da6-4122-828b-1d58fd8e40d3","Type":"ContainerStarted","Data":"3925e4b24e6399721f53878c2ca0aa85058250cd23c7e7f291ce5d0d538d5ef3"} Feb 25 16:06:48 crc kubenswrapper[4937]: I0225 16:06:48.178759 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:48 crc kubenswrapper[4937]: I0225 16:06:48.209094 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-qlk4x" podStartSLOduration=5.731540739 podStartE2EDuration="15.209075601s" podCreationTimestamp="2026-02-25 16:06:33 +0000 UTC" firstStartedPulling="2026-02-25 16:06:34.44100174 +0000 UTC m=+1245.454393630" lastFinishedPulling="2026-02-25 16:06:43.918536602 +0000 UTC m=+1254.931928492" observedRunningTime="2026-02-25 16:06:48.204108386 +0000 UTC m=+1259.217500266" watchObservedRunningTime="2026-02-25 16:06:48.209075601 +0000 UTC m=+1259.222467491" Feb 25 16:06:49 crc kubenswrapper[4937]: I0225 16:06:49.261181 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:49 crc kubenswrapper[4937]: I0225 16:06:49.335274 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:06:54 crc kubenswrapper[4937]: I0225 16:06:54.257432 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-zl6xj" Feb 25 16:06:55 crc kubenswrapper[4937]: I0225 16:06:55.873343 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-vpqx7" Feb 25 16:06:58 crc kubenswrapper[4937]: I0225 16:06:58.760473 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6gjsq"] Feb 25 16:06:58 crc kubenswrapper[4937]: I0225 16:06:58.761585 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6gjsq" Feb 25 16:06:58 crc kubenswrapper[4937]: I0225 16:06:58.765205 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 25 16:06:58 crc kubenswrapper[4937]: I0225 16:06:58.765477 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-zg4jm" Feb 25 16:06:58 crc kubenswrapper[4937]: I0225 16:06:58.765837 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 25 16:06:58 crc kubenswrapper[4937]: I0225 16:06:58.796372 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6gjsq"] Feb 25 16:06:58 crc kubenswrapper[4937]: I0225 16:06:58.945287 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcv45\" (UniqueName: \"kubernetes.io/projected/7417c402-f7b0-4e09-80e8-54ea12ec15a4-kube-api-access-wcv45\") pod \"openstack-operator-index-6gjsq\" (UID: \"7417c402-f7b0-4e09-80e8-54ea12ec15a4\") " pod="openstack-operators/openstack-operator-index-6gjsq" Feb 25 16:06:59 crc kubenswrapper[4937]: I0225 16:06:59.047131 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcv45\" (UniqueName: \"kubernetes.io/projected/7417c402-f7b0-4e09-80e8-54ea12ec15a4-kube-api-access-wcv45\") pod \"openstack-operator-index-6gjsq\" (UID: \"7417c402-f7b0-4e09-80e8-54ea12ec15a4\") " pod="openstack-operators/openstack-operator-index-6gjsq" Feb 25 16:06:59 crc kubenswrapper[4937]: I0225 16:06:59.071854 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcv45\" (UniqueName: \"kubernetes.io/projected/7417c402-f7b0-4e09-80e8-54ea12ec15a4-kube-api-access-wcv45\") pod \"openstack-operator-index-6gjsq\" (UID: \"7417c402-f7b0-4e09-80e8-54ea12ec15a4\") " pod="openstack-operators/openstack-operator-index-6gjsq" Feb 25 16:06:59 crc kubenswrapper[4937]: I0225 16:06:59.083836 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6gjsq" Feb 25 16:06:59 crc kubenswrapper[4937]: I0225 16:06:59.586123 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6gjsq"] Feb 25 16:06:59 crc kubenswrapper[4937]: W0225 16:06:59.594302 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7417c402_f7b0_4e09_80e8_54ea12ec15a4.slice/crio-e9993cfe63649016927e82f7671bcc72228c4631210b8310ae6b0c0b41faec31 WatchSource:0}: Error finding container e9993cfe63649016927e82f7671bcc72228c4631210b8310ae6b0c0b41faec31: Status 404 returned error can't find the container with id e9993cfe63649016927e82f7671bcc72228c4631210b8310ae6b0c0b41faec31 Feb 25 16:06:59 crc kubenswrapper[4937]: I0225 16:06:59.596594 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 16:07:00 crc kubenswrapper[4937]: I0225 16:07:00.265251 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6gjsq" event={"ID":"7417c402-f7b0-4e09-80e8-54ea12ec15a4","Type":"ContainerStarted","Data":"e9993cfe63649016927e82f7671bcc72228c4631210b8310ae6b0c0b41faec31"} Feb 25 16:07:01 crc kubenswrapper[4937]: I0225 16:07:01.336442 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6gjsq"] Feb 25 16:07:01 crc kubenswrapper[4937]: I0225 16:07:01.739647 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7hxvd"] Feb 25 16:07:01 crc kubenswrapper[4937]: I0225 16:07:01.740757 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7hxvd" Feb 25 16:07:01 crc kubenswrapper[4937]: I0225 16:07:01.750911 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7hxvd"] Feb 25 16:07:01 crc kubenswrapper[4937]: I0225 16:07:01.891386 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd4jj\" (UniqueName: \"kubernetes.io/projected/65bfe4a4-8d7d-48bc-823a-5b388022052f-kube-api-access-pd4jj\") pod \"openstack-operator-index-7hxvd\" (UID: \"65bfe4a4-8d7d-48bc-823a-5b388022052f\") " pod="openstack-operators/openstack-operator-index-7hxvd" Feb 25 16:07:01 crc kubenswrapper[4937]: I0225 16:07:01.993467 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd4jj\" (UniqueName: \"kubernetes.io/projected/65bfe4a4-8d7d-48bc-823a-5b388022052f-kube-api-access-pd4jj\") pod \"openstack-operator-index-7hxvd\" (UID: \"65bfe4a4-8d7d-48bc-823a-5b388022052f\") " pod="openstack-operators/openstack-operator-index-7hxvd" Feb 25 16:07:02 crc kubenswrapper[4937]: I0225 16:07:02.013650 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd4jj\" (UniqueName: \"kubernetes.io/projected/65bfe4a4-8d7d-48bc-823a-5b388022052f-kube-api-access-pd4jj\") pod \"openstack-operator-index-7hxvd\" (UID: \"65bfe4a4-8d7d-48bc-823a-5b388022052f\") " pod="openstack-operators/openstack-operator-index-7hxvd" Feb 25 16:07:02 crc kubenswrapper[4937]: I0225 16:07:02.101694 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7hxvd" Feb 25 16:07:03 crc kubenswrapper[4937]: I0225 16:07:03.283790 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6gjsq" event={"ID":"7417c402-f7b0-4e09-80e8-54ea12ec15a4","Type":"ContainerStarted","Data":"dac4e5621cf576701033246fff5963bb6883a9ab86965d4de3dc82bafa022dc6"} Feb 25 16:07:03 crc kubenswrapper[4937]: I0225 16:07:03.284402 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-6gjsq" podUID="7417c402-f7b0-4e09-80e8-54ea12ec15a4" containerName="registry-server" containerID="cri-o://dac4e5621cf576701033246fff5963bb6883a9ab86965d4de3dc82bafa022dc6" gracePeriod=2 Feb 25 16:07:03 crc kubenswrapper[4937]: I0225 16:07:03.304729 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6gjsq" podStartSLOduration=1.8317242679999999 podStartE2EDuration="5.304712369s" podCreationTimestamp="2026-02-25 16:06:58 +0000 UTC" firstStartedPulling="2026-02-25 16:06:59.596375933 +0000 UTC m=+1270.609767823" lastFinishedPulling="2026-02-25 16:07:03.069364034 +0000 UTC m=+1274.082755924" observedRunningTime="2026-02-25 16:07:03.300910534 +0000 UTC m=+1274.314302424" watchObservedRunningTime="2026-02-25 16:07:03.304712369 +0000 UTC m=+1274.318104259" Feb 25 16:07:03 crc kubenswrapper[4937]: I0225 16:07:03.415593 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7hxvd"] Feb 25 16:07:03 crc kubenswrapper[4937]: W0225 16:07:03.417862 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65bfe4a4_8d7d_48bc_823a_5b388022052f.slice/crio-667d373b13db99f08e40bb463a6321f88095f022f1735eb5998d5f0843042912 WatchSource:0}: Error finding container 667d373b13db99f08e40bb463a6321f88095f022f1735eb5998d5f0843042912: Status 404 returned error can't find the container with id 667d373b13db99f08e40bb463a6321f88095f022f1735eb5998d5f0843042912 Feb 25 16:07:03 crc kubenswrapper[4937]: I0225 16:07:03.658735 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6gjsq" Feb 25 16:07:03 crc kubenswrapper[4937]: I0225 16:07:03.829092 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcv45\" (UniqueName: \"kubernetes.io/projected/7417c402-f7b0-4e09-80e8-54ea12ec15a4-kube-api-access-wcv45\") pod \"7417c402-f7b0-4e09-80e8-54ea12ec15a4\" (UID: \"7417c402-f7b0-4e09-80e8-54ea12ec15a4\") " Feb 25 16:07:03 crc kubenswrapper[4937]: I0225 16:07:03.834241 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7417c402-f7b0-4e09-80e8-54ea12ec15a4-kube-api-access-wcv45" (OuterVolumeSpecName: "kube-api-access-wcv45") pod "7417c402-f7b0-4e09-80e8-54ea12ec15a4" (UID: "7417c402-f7b0-4e09-80e8-54ea12ec15a4"). InnerVolumeSpecName "kube-api-access-wcv45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:07:03 crc kubenswrapper[4937]: I0225 16:07:03.930615 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcv45\" (UniqueName: \"kubernetes.io/projected/7417c402-f7b0-4e09-80e8-54ea12ec15a4-kube-api-access-wcv45\") on node \"crc\" DevicePath \"\"" Feb 25 16:07:04 crc kubenswrapper[4937]: I0225 16:07:04.264582 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-qlk4x" Feb 25 16:07:04 crc kubenswrapper[4937]: I0225 16:07:04.297286 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7hxvd" event={"ID":"65bfe4a4-8d7d-48bc-823a-5b388022052f","Type":"ContainerStarted","Data":"31fae17b5e4c22ac178d81dbcb60965c763bc453bafd0b8d21996d4ee9d79702"} Feb 25 16:07:04 crc kubenswrapper[4937]: I0225 16:07:04.297336 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7hxvd" event={"ID":"65bfe4a4-8d7d-48bc-823a-5b388022052f","Type":"ContainerStarted","Data":"667d373b13db99f08e40bb463a6321f88095f022f1735eb5998d5f0843042912"} Feb 25 16:07:04 crc kubenswrapper[4937]: I0225 16:07:04.301062 4937 generic.go:334] "Generic (PLEG): container finished" podID="7417c402-f7b0-4e09-80e8-54ea12ec15a4" containerID="dac4e5621cf576701033246fff5963bb6883a9ab86965d4de3dc82bafa022dc6" exitCode=0 Feb 25 16:07:04 crc kubenswrapper[4937]: I0225 16:07:04.301122 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6gjsq" event={"ID":"7417c402-f7b0-4e09-80e8-54ea12ec15a4","Type":"ContainerDied","Data":"dac4e5621cf576701033246fff5963bb6883a9ab86965d4de3dc82bafa022dc6"} Feb 25 16:07:04 crc kubenswrapper[4937]: I0225 16:07:04.301129 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6gjsq" Feb 25 16:07:04 crc kubenswrapper[4937]: I0225 16:07:04.301154 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6gjsq" event={"ID":"7417c402-f7b0-4e09-80e8-54ea12ec15a4","Type":"ContainerDied","Data":"e9993cfe63649016927e82f7671bcc72228c4631210b8310ae6b0c0b41faec31"} Feb 25 16:07:04 crc kubenswrapper[4937]: I0225 16:07:04.301188 4937 scope.go:117] "RemoveContainer" containerID="dac4e5621cf576701033246fff5963bb6883a9ab86965d4de3dc82bafa022dc6" Feb 25 16:07:04 crc kubenswrapper[4937]: I0225 16:07:04.323716 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7hxvd" podStartSLOduration=3.267793716 podStartE2EDuration="3.323685771s" podCreationTimestamp="2026-02-25 16:07:01 +0000 UTC" firstStartedPulling="2026-02-25 16:07:03.423268509 +0000 UTC m=+1274.436660399" lastFinishedPulling="2026-02-25 16:07:03.479160554 +0000 UTC m=+1274.492552454" observedRunningTime="2026-02-25 16:07:04.31131762 +0000 UTC m=+1275.324709520" watchObservedRunningTime="2026-02-25 16:07:04.323685771 +0000 UTC m=+1275.337077661" Feb 25 16:07:04 crc kubenswrapper[4937]: I0225 16:07:04.334867 4937 scope.go:117] "RemoveContainer" containerID="dac4e5621cf576701033246fff5963bb6883a9ab86965d4de3dc82bafa022dc6" Feb 25 16:07:04 crc kubenswrapper[4937]: E0225 16:07:04.335894 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dac4e5621cf576701033246fff5963bb6883a9ab86965d4de3dc82bafa022dc6\": container with ID starting with dac4e5621cf576701033246fff5963bb6883a9ab86965d4de3dc82bafa022dc6 not found: ID does not exist" containerID="dac4e5621cf576701033246fff5963bb6883a9ab86965d4de3dc82bafa022dc6" Feb 25 16:07:04 crc kubenswrapper[4937]: I0225 16:07:04.335928 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dac4e5621cf576701033246fff5963bb6883a9ab86965d4de3dc82bafa022dc6"} err="failed to get container status \"dac4e5621cf576701033246fff5963bb6883a9ab86965d4de3dc82bafa022dc6\": rpc error: code = NotFound desc = could not find container \"dac4e5621cf576701033246fff5963bb6883a9ab86965d4de3dc82bafa022dc6\": container with ID starting with dac4e5621cf576701033246fff5963bb6883a9ab86965d4de3dc82bafa022dc6 not found: ID does not exist" Feb 25 16:07:04 crc kubenswrapper[4937]: I0225 16:07:04.355073 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6gjsq"] Feb 25 16:07:04 crc kubenswrapper[4937]: I0225 16:07:04.364632 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-6gjsq"] Feb 25 16:07:05 crc kubenswrapper[4937]: I0225 16:07:05.374774 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7417c402-f7b0-4e09-80e8-54ea12ec15a4" path="/var/lib/kubelet/pods/7417c402-f7b0-4e09-80e8-54ea12ec15a4/volumes" Feb 25 16:07:11 crc kubenswrapper[4937]: I0225 16:07:11.495075 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:07:11 crc kubenswrapper[4937]: I0225 16:07:11.495465 4937 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:07:12 crc kubenswrapper[4937]: I0225 16:07:12.102183 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7hxvd" Feb 25 16:07:12 crc kubenswrapper[4937]: I0225 16:07:12.102251 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-7hxvd" Feb 25 16:07:12 crc kubenswrapper[4937]: I0225 16:07:12.150301 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-7hxvd" Feb 25 16:07:12 crc kubenswrapper[4937]: I0225 16:07:12.401237 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-7hxvd" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.003967 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7"] Feb 25 16:07:14 crc kubenswrapper[4937]: E0225 16:07:14.004359 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7417c402-f7b0-4e09-80e8-54ea12ec15a4" containerName="registry-server" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.004381 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="7417c402-f7b0-4e09-80e8-54ea12ec15a4" containerName="registry-server" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.004671 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="7417c402-f7b0-4e09-80e8-54ea12ec15a4" containerName="registry-server" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.006271 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.019796 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-xmxkd" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.027798 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7"] Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.177506 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7671f573-d466-4764-9094-4cc7250e6d3d-util\") pod \"6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7\" (UID: \"7671f573-d466-4764-9094-4cc7250e6d3d\") " pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.177590 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xgbv\" (UniqueName: \"kubernetes.io/projected/7671f573-d466-4764-9094-4cc7250e6d3d-kube-api-access-6xgbv\") pod \"6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7\" (UID: \"7671f573-d466-4764-9094-4cc7250e6d3d\") " pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.177651 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7671f573-d466-4764-9094-4cc7250e6d3d-bundle\") pod \"6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7\" (UID: \"7671f573-d466-4764-9094-4cc7250e6d3d\") " pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.278727 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7671f573-d466-4764-9094-4cc7250e6d3d-util\") pod \"6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7\" (UID: \"7671f573-d466-4764-9094-4cc7250e6d3d\") " pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.278798 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xgbv\" (UniqueName: \"kubernetes.io/projected/7671f573-d466-4764-9094-4cc7250e6d3d-kube-api-access-6xgbv\") pod \"6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7\" (UID: \"7671f573-d466-4764-9094-4cc7250e6d3d\") " pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.278854 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7671f573-d466-4764-9094-4cc7250e6d3d-bundle\") pod \"6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7\" (UID: \"7671f573-d466-4764-9094-4cc7250e6d3d\") " pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.279585 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7671f573-d466-4764-9094-4cc7250e6d3d-bundle\") pod \"6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7\" (UID: \"7671f573-d466-4764-9094-4cc7250e6d3d\") " pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.279604 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7671f573-d466-4764-9094-4cc7250e6d3d-util\") pod \"6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7\" (UID: \"7671f573-d466-4764-9094-4cc7250e6d3d\") " pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.300805 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xgbv\" (UniqueName: \"kubernetes.io/projected/7671f573-d466-4764-9094-4cc7250e6d3d-kube-api-access-6xgbv\") pod \"6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7\" (UID: \"7671f573-d466-4764-9094-4cc7250e6d3d\") " pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.356938 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" Feb 25 16:07:14 crc kubenswrapper[4937]: I0225 16:07:14.824337 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7"] Feb 25 16:07:14 crc kubenswrapper[4937]: W0225 16:07:14.842982 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7671f573_d466_4764_9094_4cc7250e6d3d.slice/crio-66caefb95c027781a25275b995185ba195d7e10841c8919b6adab6f4f3f79a1b WatchSource:0}: Error finding container 66caefb95c027781a25275b995185ba195d7e10841c8919b6adab6f4f3f79a1b: Status 404 returned error can't find the container with id 66caefb95c027781a25275b995185ba195d7e10841c8919b6adab6f4f3f79a1b Feb 25 16:07:15 crc kubenswrapper[4937]: I0225 16:07:15.398273 4937 generic.go:334] "Generic (PLEG): container finished" podID="7671f573-d466-4764-9094-4cc7250e6d3d" containerID="1771079385f51d3750642bb034d560614fc086ac50c13e502ff7c7476795c06f" exitCode=0 Feb 25 16:07:15 crc kubenswrapper[4937]: I0225 16:07:15.398345 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" event={"ID":"7671f573-d466-4764-9094-4cc7250e6d3d","Type":"ContainerDied","Data":"1771079385f51d3750642bb034d560614fc086ac50c13e502ff7c7476795c06f"} Feb 25 16:07:15 crc kubenswrapper[4937]: I0225 16:07:15.398390 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" event={"ID":"7671f573-d466-4764-9094-4cc7250e6d3d","Type":"ContainerStarted","Data":"66caefb95c027781a25275b995185ba195d7e10841c8919b6adab6f4f3f79a1b"} Feb 25 16:07:16 crc kubenswrapper[4937]: I0225 16:07:16.417369 4937 generic.go:334] "Generic (PLEG): container finished" podID="7671f573-d466-4764-9094-4cc7250e6d3d" containerID="abf555c9d6337418a75ab7bcf6645b1a72ce5a998cfb6d9c17d9c7aed4bad3d1" exitCode=0 Feb 25 16:07:16 crc kubenswrapper[4937]: I0225 16:07:16.418393 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" event={"ID":"7671f573-d466-4764-9094-4cc7250e6d3d","Type":"ContainerDied","Data":"abf555c9d6337418a75ab7bcf6645b1a72ce5a998cfb6d9c17d9c7aed4bad3d1"} Feb 25 16:07:17 crc kubenswrapper[4937]: I0225 16:07:17.428182 4937 generic.go:334] "Generic (PLEG): container finished" podID="7671f573-d466-4764-9094-4cc7250e6d3d" containerID="469d564c9bfc61784490a098041805dfeb488f25892e3fe15995b1c119ca5923" exitCode=0 Feb 25 16:07:17 crc kubenswrapper[4937]: I0225 16:07:17.428281 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" event={"ID":"7671f573-d466-4764-9094-4cc7250e6d3d","Type":"ContainerDied","Data":"469d564c9bfc61784490a098041805dfeb488f25892e3fe15995b1c119ca5923"} Feb 25 16:07:19 crc kubenswrapper[4937]: I0225 16:07:19.653781 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" Feb 25 16:07:19 crc kubenswrapper[4937]: I0225 16:07:19.793146 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xgbv\" (UniqueName: \"kubernetes.io/projected/7671f573-d466-4764-9094-4cc7250e6d3d-kube-api-access-6xgbv\") pod \"7671f573-d466-4764-9094-4cc7250e6d3d\" (UID: \"7671f573-d466-4764-9094-4cc7250e6d3d\") " Feb 25 16:07:19 crc kubenswrapper[4937]: I0225 16:07:19.793404 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7671f573-d466-4764-9094-4cc7250e6d3d-bundle\") pod \"7671f573-d466-4764-9094-4cc7250e6d3d\" (UID: \"7671f573-d466-4764-9094-4cc7250e6d3d\") " Feb 25 16:07:19 crc kubenswrapper[4937]: I0225 16:07:19.793521 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7671f573-d466-4764-9094-4cc7250e6d3d-util\") pod \"7671f573-d466-4764-9094-4cc7250e6d3d\" (UID: \"7671f573-d466-4764-9094-4cc7250e6d3d\") " Feb 25 16:07:19 crc kubenswrapper[4937]: I0225 16:07:19.794470 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7671f573-d466-4764-9094-4cc7250e6d3d-bundle" (OuterVolumeSpecName: "bundle") pod "7671f573-d466-4764-9094-4cc7250e6d3d" (UID: "7671f573-d466-4764-9094-4cc7250e6d3d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:07:19 crc kubenswrapper[4937]: I0225 16:07:19.801909 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7671f573-d466-4764-9094-4cc7250e6d3d-kube-api-access-6xgbv" (OuterVolumeSpecName: "kube-api-access-6xgbv") pod "7671f573-d466-4764-9094-4cc7250e6d3d" (UID: "7671f573-d466-4764-9094-4cc7250e6d3d"). InnerVolumeSpecName "kube-api-access-6xgbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:07:19 crc kubenswrapper[4937]: I0225 16:07:19.814240 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7671f573-d466-4764-9094-4cc7250e6d3d-util" (OuterVolumeSpecName: "util") pod "7671f573-d466-4764-9094-4cc7250e6d3d" (UID: "7671f573-d466-4764-9094-4cc7250e6d3d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:07:19 crc kubenswrapper[4937]: I0225 16:07:19.895476 4937 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7671f573-d466-4764-9094-4cc7250e6d3d-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:07:19 crc kubenswrapper[4937]: I0225 16:07:19.895595 4937 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7671f573-d466-4764-9094-4cc7250e6d3d-util\") on node \"crc\" DevicePath \"\"" Feb 25 16:07:19 crc kubenswrapper[4937]: I0225 16:07:19.895614 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xgbv\" (UniqueName: \"kubernetes.io/projected/7671f573-d466-4764-9094-4cc7250e6d3d-kube-api-access-6xgbv\") on node \"crc\" DevicePath \"\"" Feb 25 16:07:20 crc kubenswrapper[4937]: I0225 16:07:20.513290 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" event={"ID":"7671f573-d466-4764-9094-4cc7250e6d3d","Type":"ContainerDied","Data":"66caefb95c027781a25275b995185ba195d7e10841c8919b6adab6f4f3f79a1b"} Feb 25 16:07:20 crc kubenswrapper[4937]: I0225 16:07:20.513341 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66caefb95c027781a25275b995185ba195d7e10841c8919b6adab6f4f3f79a1b" Feb 25 16:07:20 crc kubenswrapper[4937]: I0225 16:07:20.513369 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7" Feb 25 16:07:26 crc kubenswrapper[4937]: I0225 16:07:26.575977 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5f5c559654-md6zv"] Feb 25 16:07:26 crc kubenswrapper[4937]: E0225 16:07:26.576630 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7671f573-d466-4764-9094-4cc7250e6d3d" containerName="extract" Feb 25 16:07:26 crc kubenswrapper[4937]: I0225 16:07:26.576641 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="7671f573-d466-4764-9094-4cc7250e6d3d" containerName="extract" Feb 25 16:07:26 crc kubenswrapper[4937]: E0225 16:07:26.576651 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7671f573-d466-4764-9094-4cc7250e6d3d" containerName="pull" Feb 25 16:07:26 crc kubenswrapper[4937]: I0225 16:07:26.576657 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="7671f573-d466-4764-9094-4cc7250e6d3d" containerName="pull" Feb 25 16:07:26 crc kubenswrapper[4937]: E0225 16:07:26.576679 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7671f573-d466-4764-9094-4cc7250e6d3d" containerName="util" Feb 25 16:07:26 crc kubenswrapper[4937]: I0225 16:07:26.576684 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="7671f573-d466-4764-9094-4cc7250e6d3d" containerName="util" Feb 25 16:07:26 crc kubenswrapper[4937]: I0225 16:07:26.576787 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="7671f573-d466-4764-9094-4cc7250e6d3d" containerName="extract" Feb 25 16:07:26 crc kubenswrapper[4937]: I0225 16:07:26.577163 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5f5c559654-md6zv" Feb 25 16:07:26 crc kubenswrapper[4937]: I0225 16:07:26.580955 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-cs4fn" Feb 25 16:07:26 crc kubenswrapper[4937]: I0225 16:07:26.599829 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5f5c559654-md6zv"] Feb 25 16:07:26 crc kubenswrapper[4937]: I0225 16:07:26.692027 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t88x6\" (UniqueName: \"kubernetes.io/projected/380b8472-bb7f-421e-8a0a-7da8078b6ecc-kube-api-access-t88x6\") pod \"openstack-operator-controller-init-5f5c559654-md6zv\" (UID: \"380b8472-bb7f-421e-8a0a-7da8078b6ecc\") " pod="openstack-operators/openstack-operator-controller-init-5f5c559654-md6zv" Feb 25 16:07:26 crc kubenswrapper[4937]: I0225 16:07:26.793149 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t88x6\" (UniqueName: \"kubernetes.io/projected/380b8472-bb7f-421e-8a0a-7da8078b6ecc-kube-api-access-t88x6\") pod \"openstack-operator-controller-init-5f5c559654-md6zv\" (UID: \"380b8472-bb7f-421e-8a0a-7da8078b6ecc\") " pod="openstack-operators/openstack-operator-controller-init-5f5c559654-md6zv" Feb 25 16:07:26 crc kubenswrapper[4937]: I0225 16:07:26.816283 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t88x6\" (UniqueName: \"kubernetes.io/projected/380b8472-bb7f-421e-8a0a-7da8078b6ecc-kube-api-access-t88x6\") pod \"openstack-operator-controller-init-5f5c559654-md6zv\" (UID: \"380b8472-bb7f-421e-8a0a-7da8078b6ecc\") " pod="openstack-operators/openstack-operator-controller-init-5f5c559654-md6zv" Feb 25 16:07:26 crc kubenswrapper[4937]: I0225 16:07:26.891186 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5f5c559654-md6zv" Feb 25 16:07:27 crc kubenswrapper[4937]: I0225 16:07:27.145421 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5f5c559654-md6zv"] Feb 25 16:07:27 crc kubenswrapper[4937]: W0225 16:07:27.156501 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod380b8472_bb7f_421e_8a0a_7da8078b6ecc.slice/crio-5a8c0abdc77002be525b77773c5b4c13b9c8710dbd8ca894be5e17c149396388 WatchSource:0}: Error finding container 5a8c0abdc77002be525b77773c5b4c13b9c8710dbd8ca894be5e17c149396388: Status 404 returned error can't find the container with id 5a8c0abdc77002be525b77773c5b4c13b9c8710dbd8ca894be5e17c149396388 Feb 25 16:07:27 crc kubenswrapper[4937]: I0225 16:07:27.563370 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5f5c559654-md6zv" event={"ID":"380b8472-bb7f-421e-8a0a-7da8078b6ecc","Type":"ContainerStarted","Data":"5a8c0abdc77002be525b77773c5b4c13b9c8710dbd8ca894be5e17c149396388"} Feb 25 16:07:33 crc kubenswrapper[4937]: I0225 16:07:33.613460 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5f5c559654-md6zv" event={"ID":"380b8472-bb7f-421e-8a0a-7da8078b6ecc","Type":"ContainerStarted","Data":"2eea222d6d9a74d6cc09a758afb2b953761f5e8d06230554a9bb450e113dc83b"} Feb 25 16:07:33 crc kubenswrapper[4937]: I0225 16:07:33.613930 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5f5c559654-md6zv" Feb 25 16:07:33 crc kubenswrapper[4937]: I0225 16:07:33.645371 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5f5c559654-md6zv" podStartSLOduration=1.464092667 podStartE2EDuration="7.645356159s" podCreationTimestamp="2026-02-25 16:07:26 +0000 UTC" firstStartedPulling="2026-02-25 16:07:27.158244789 +0000 UTC m=+1298.171636679" lastFinishedPulling="2026-02-25 16:07:33.339508281 +0000 UTC m=+1304.352900171" observedRunningTime="2026-02-25 16:07:33.641636935 +0000 UTC m=+1304.655028825" watchObservedRunningTime="2026-02-25 16:07:33.645356159 +0000 UTC m=+1304.658748049" Feb 25 16:07:41 crc kubenswrapper[4937]: I0225 16:07:41.494446 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:07:41 crc kubenswrapper[4937]: I0225 16:07:41.495009 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:07:41 crc kubenswrapper[4937]: I0225 16:07:41.495059 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 16:07:41 crc kubenswrapper[4937]: I0225 16:07:41.495695 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"82d7f39c6bdd0c324e2d3b37551824fca9f991542926e3b5f5cc2a5a3ef74dfb"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 16:07:41 crc kubenswrapper[4937]: I0225 16:07:41.495747 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://82d7f39c6bdd0c324e2d3b37551824fca9f991542926e3b5f5cc2a5a3ef74dfb" gracePeriod=600 Feb 25 16:07:41 crc kubenswrapper[4937]: I0225 16:07:41.679462 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="82d7f39c6bdd0c324e2d3b37551824fca9f991542926e3b5f5cc2a5a3ef74dfb" exitCode=0 Feb 25 16:07:41 crc kubenswrapper[4937]: I0225 16:07:41.679595 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"82d7f39c6bdd0c324e2d3b37551824fca9f991542926e3b5f5cc2a5a3ef74dfb"} Feb 25 16:07:41 crc kubenswrapper[4937]: I0225 16:07:41.680122 4937 scope.go:117] "RemoveContainer" containerID="a3de247f04ff3abf939866313cfef1da7c2e6ae7d14d3da3ecda7ba81bfc35f7" Feb 25 16:07:42 crc kubenswrapper[4937]: I0225 16:07:42.690682 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"710133016a8fda213d788ff3f0a0661f137f661d0c6764233454878cf67045e1"} Feb 25 16:07:46 crc kubenswrapper[4937]: I0225 16:07:46.895061 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5f5c559654-md6zv" Feb 25 16:08:00 crc kubenswrapper[4937]: I0225 16:08:00.154799 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533928-xf6hf"] Feb 25 16:08:00 crc kubenswrapper[4937]: I0225 16:08:00.155944 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533928-xf6hf" Feb 25 16:08:00 crc kubenswrapper[4937]: I0225 16:08:00.163542 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:08:00 crc kubenswrapper[4937]: I0225 16:08:00.163704 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:08:00 crc kubenswrapper[4937]: I0225 16:08:00.163757 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:08:00 crc kubenswrapper[4937]: I0225 16:08:00.169696 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533928-xf6hf"] Feb 25 16:08:00 crc kubenswrapper[4937]: I0225 16:08:00.202677 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjj7s\" (UniqueName: \"kubernetes.io/projected/934150ac-0fe5-4dad-ba78-7fdc77f53fb5-kube-api-access-xjj7s\") pod \"auto-csr-approver-29533928-xf6hf\" (UID: \"934150ac-0fe5-4dad-ba78-7fdc77f53fb5\") " pod="openshift-infra/auto-csr-approver-29533928-xf6hf" Feb 25 16:08:00 crc kubenswrapper[4937]: I0225 16:08:00.304606 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjj7s\" (UniqueName: \"kubernetes.io/projected/934150ac-0fe5-4dad-ba78-7fdc77f53fb5-kube-api-access-xjj7s\") pod \"auto-csr-approver-29533928-xf6hf\" (UID: \"934150ac-0fe5-4dad-ba78-7fdc77f53fb5\") " pod="openshift-infra/auto-csr-approver-29533928-xf6hf" Feb 25 16:08:00 crc kubenswrapper[4937]: I0225 16:08:00.326340 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjj7s\" (UniqueName: \"kubernetes.io/projected/934150ac-0fe5-4dad-ba78-7fdc77f53fb5-kube-api-access-xjj7s\") pod \"auto-csr-approver-29533928-xf6hf\" (UID: \"934150ac-0fe5-4dad-ba78-7fdc77f53fb5\") " pod="openshift-infra/auto-csr-approver-29533928-xf6hf" Feb 25 16:08:00 crc kubenswrapper[4937]: I0225 16:08:00.481686 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533928-xf6hf" Feb 25 16:08:00 crc kubenswrapper[4937]: I0225 16:08:00.897229 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533928-xf6hf"] Feb 25 16:08:01 crc kubenswrapper[4937]: I0225 16:08:01.840744 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533928-xf6hf" event={"ID":"934150ac-0fe5-4dad-ba78-7fdc77f53fb5","Type":"ContainerStarted","Data":"616a8b050a64348b287e5b311af116d31da4b9f2ccb16ebd78c4a88c700ec79c"} Feb 25 16:08:02 crc kubenswrapper[4937]: I0225 16:08:02.848405 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533928-xf6hf" event={"ID":"934150ac-0fe5-4dad-ba78-7fdc77f53fb5","Type":"ContainerStarted","Data":"8485e4e6c778f1b465c642cc0a1962e713cf730d59aea13a70b205613b03f028"} Feb 25 16:08:03 crc kubenswrapper[4937]: I0225 16:08:03.856807 4937 generic.go:334] "Generic (PLEG): container finished" podID="934150ac-0fe5-4dad-ba78-7fdc77f53fb5" containerID="8485e4e6c778f1b465c642cc0a1962e713cf730d59aea13a70b205613b03f028" exitCode=0 Feb 25 16:08:03 crc kubenswrapper[4937]: I0225 16:08:03.856875 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533928-xf6hf" event={"ID":"934150ac-0fe5-4dad-ba78-7fdc77f53fb5","Type":"ContainerDied","Data":"8485e4e6c778f1b465c642cc0a1962e713cf730d59aea13a70b205613b03f028"} Feb 25 16:08:04 crc kubenswrapper[4937]: I0225 16:08:04.103483 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533928-xf6hf" Feb 25 16:08:04 crc kubenswrapper[4937]: I0225 16:08:04.257555 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjj7s\" (UniqueName: \"kubernetes.io/projected/934150ac-0fe5-4dad-ba78-7fdc77f53fb5-kube-api-access-xjj7s\") pod \"934150ac-0fe5-4dad-ba78-7fdc77f53fb5\" (UID: \"934150ac-0fe5-4dad-ba78-7fdc77f53fb5\") " Feb 25 16:08:04 crc kubenswrapper[4937]: I0225 16:08:04.266015 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934150ac-0fe5-4dad-ba78-7fdc77f53fb5-kube-api-access-xjj7s" (OuterVolumeSpecName: "kube-api-access-xjj7s") pod "934150ac-0fe5-4dad-ba78-7fdc77f53fb5" (UID: "934150ac-0fe5-4dad-ba78-7fdc77f53fb5"). InnerVolumeSpecName "kube-api-access-xjj7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:08:04 crc kubenswrapper[4937]: I0225 16:08:04.359544 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjj7s\" (UniqueName: \"kubernetes.io/projected/934150ac-0fe5-4dad-ba78-7fdc77f53fb5-kube-api-access-xjj7s\") on node \"crc\" DevicePath \"\"" Feb 25 16:08:04 crc kubenswrapper[4937]: I0225 16:08:04.866401 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533928-xf6hf" event={"ID":"934150ac-0fe5-4dad-ba78-7fdc77f53fb5","Type":"ContainerDied","Data":"616a8b050a64348b287e5b311af116d31da4b9f2ccb16ebd78c4a88c700ec79c"} Feb 25 16:08:04 crc kubenswrapper[4937]: I0225 16:08:04.866440 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="616a8b050a64348b287e5b311af116d31da4b9f2ccb16ebd78c4a88c700ec79c" Feb 25 16:08:04 crc kubenswrapper[4937]: I0225 16:08:04.866570 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533928-xf6hf" Feb 25 16:08:05 crc kubenswrapper[4937]: I0225 16:08:05.176941 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533922-nkq2g"] Feb 25 16:08:05 crc kubenswrapper[4937]: I0225 16:08:05.181208 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533922-nkq2g"] Feb 25 16:08:05 crc kubenswrapper[4937]: I0225 16:08:05.374516 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d6a5911-b7a5-4523-bead-543b0a0ccdcc" path="/var/lib/kubelet/pods/2d6a5911-b7a5-4523-bead-543b0a0ccdcc/volumes" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.570027 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-72hwb"] Feb 25 16:08:11 crc kubenswrapper[4937]: E0225 16:08:11.570976 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934150ac-0fe5-4dad-ba78-7fdc77f53fb5" containerName="oc" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.571002 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="934150ac-0fe5-4dad-ba78-7fdc77f53fb5" containerName="oc" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.571272 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="934150ac-0fe5-4dad-ba78-7fdc77f53fb5" containerName="oc" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.572108 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-72hwb" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.575379 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-c6jdx" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.580845 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lkw74"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.581919 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lkw74" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.585007 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-79h8l" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.590321 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-72hwb"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.594585 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-95dsz"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.595401 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-95dsz" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.599317 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-rbk7w" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.603208 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-q92r7"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.604033 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-q92r7" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.608805 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-8247l" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.614914 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lkw74"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.647003 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-k7z4s"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.648145 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-k7z4s" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.649951 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vhrpf" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.658390 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m8vq\" (UniqueName: \"kubernetes.io/projected/1e2c3857-1279-466f-8da3-ea1f5cf13893-kube-api-access-9m8vq\") pod \"barbican-operator-controller-manager-868647ff47-72hwb\" (UID: \"1e2c3857-1279-466f-8da3-ea1f5cf13893\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-72hwb" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.658561 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fp7j\" (UniqueName: \"kubernetes.io/projected/079024f7-46b2-46fa-b96b-e4dca470cb4b-kube-api-access-7fp7j\") pod \"glance-operator-controller-manager-784b5bb6c5-q92r7\" (UID: \"079024f7-46b2-46fa-b96b-e4dca470cb4b\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-q92r7" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.658607 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrb94\" (UniqueName: \"kubernetes.io/projected/8132d735-0341-43be-93de-730c15511083-kube-api-access-lrb94\") pod \"heat-operator-controller-manager-69f49c598c-k7z4s\" (UID: \"8132d735-0341-43be-93de-730c15511083\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-k7z4s" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.658625 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjvzl\" (UniqueName: \"kubernetes.io/projected/806dde6d-ac75-47d7-98e2-0ba5959614a3-kube-api-access-rjvzl\") pod \"cinder-operator-controller-manager-55d77d7b5c-lkw74\" (UID: 
\"806dde6d-ac75-47d7-98e2-0ba5959614a3\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lkw74" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.658640 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqvtt\" (UniqueName: \"kubernetes.io/projected/5c7c6408-d0c4-42ea-ae7b-e10b49e13355-kube-api-access-pqvtt\") pod \"designate-operator-controller-manager-6d8bf5c495-95dsz\" (UID: \"5c7c6408-d0c4-42ea-ae7b-e10b49e13355\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-95dsz" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.673941 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-95dsz"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.681913 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-k7z4s"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.710143 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-q92r7"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.741397 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-lnw2m"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.742243 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-lnw2m" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.745052 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-lhkkw" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.768111 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fp7j\" (UniqueName: \"kubernetes.io/projected/079024f7-46b2-46fa-b96b-e4dca470cb4b-kube-api-access-7fp7j\") pod \"glance-operator-controller-manager-784b5bb6c5-q92r7\" (UID: \"079024f7-46b2-46fa-b96b-e4dca470cb4b\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-q92r7" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.768202 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrb94\" (UniqueName: \"kubernetes.io/projected/8132d735-0341-43be-93de-730c15511083-kube-api-access-lrb94\") pod \"heat-operator-controller-manager-69f49c598c-k7z4s\" (UID: \"8132d735-0341-43be-93de-730c15511083\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-k7z4s" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.768228 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjvzl\" (UniqueName: \"kubernetes.io/projected/806dde6d-ac75-47d7-98e2-0ba5959614a3-kube-api-access-rjvzl\") pod \"cinder-operator-controller-manager-55d77d7b5c-lkw74\" (UID: \"806dde6d-ac75-47d7-98e2-0ba5959614a3\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lkw74" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.768253 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqvtt\" (UniqueName: \"kubernetes.io/projected/5c7c6408-d0c4-42ea-ae7b-e10b49e13355-kube-api-access-pqvtt\") pod \"designate-operator-controller-manager-6d8bf5c495-95dsz\" (UID: 
\"5c7c6408-d0c4-42ea-ae7b-e10b49e13355\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-95dsz" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.768454 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m8vq\" (UniqueName: \"kubernetes.io/projected/1e2c3857-1279-466f-8da3-ea1f5cf13893-kube-api-access-9m8vq\") pod \"barbican-operator-controller-manager-868647ff47-72hwb\" (UID: \"1e2c3857-1279-466f-8da3-ea1f5cf13893\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-72hwb" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.771559 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntwv6\" (UniqueName: \"kubernetes.io/projected/7b01abed-0e59-495b-8b5e-2229c8d3215f-kube-api-access-ntwv6\") pod \"horizon-operator-controller-manager-5b9b8895d5-lnw2m\" (UID: \"7b01abed-0e59-495b-8b5e-2229c8d3215f\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-lnw2m" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.787088 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-lnw2m"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.794097 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.809263 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.815244 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrb94\" (UniqueName: \"kubernetes.io/projected/8132d735-0341-43be-93de-730c15511083-kube-api-access-lrb94\") pod \"heat-operator-controller-manager-69f49c598c-k7z4s\" (UID: \"8132d735-0341-43be-93de-730c15511083\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-k7z4s" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.815590 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqvtt\" (UniqueName: \"kubernetes.io/projected/5c7c6408-d0c4-42ea-ae7b-e10b49e13355-kube-api-access-pqvtt\") pod \"designate-operator-controller-manager-6d8bf5c495-95dsz\" (UID: \"5c7c6408-d0c4-42ea-ae7b-e10b49e13355\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-95dsz" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.818093 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-958h6" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.818268 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.818274 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjvzl\" (UniqueName: \"kubernetes.io/projected/806dde6d-ac75-47d7-98e2-0ba5959614a3-kube-api-access-rjvzl\") pod \"cinder-operator-controller-manager-55d77d7b5c-lkw74\" (UID: \"806dde6d-ac75-47d7-98e2-0ba5959614a3\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lkw74" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.818850 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-9m8vq\" (UniqueName: \"kubernetes.io/projected/1e2c3857-1279-466f-8da3-ea1f5cf13893-kube-api-access-9m8vq\") pod \"barbican-operator-controller-manager-868647ff47-72hwb\" (UID: \"1e2c3857-1279-466f-8da3-ea1f5cf13893\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-72hwb" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.830048 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fp7j\" (UniqueName: \"kubernetes.io/projected/079024f7-46b2-46fa-b96b-e4dca470cb4b-kube-api-access-7fp7j\") pod \"glance-operator-controller-manager-784b5bb6c5-q92r7\" (UID: \"079024f7-46b2-46fa-b96b-e4dca470cb4b\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-q92r7" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.846458 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.859564 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-xhxm2"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.860722 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-xhxm2" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.864852 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-x9mxj" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.877936 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rbg2\" (UniqueName: \"kubernetes.io/projected/88ef567f-e68d-47aa-9788-4307003a77a0-kube-api-access-4rbg2\") pod \"infra-operator-controller-manager-79d975b745-vjpzc\" (UID: \"88ef567f-e68d-47aa-9788-4307003a77a0\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.878002 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntwv6\" (UniqueName: \"kubernetes.io/projected/7b01abed-0e59-495b-8b5e-2229c8d3215f-kube-api-access-ntwv6\") pod \"horizon-operator-controller-manager-5b9b8895d5-lnw2m\" (UID: \"7b01abed-0e59-495b-8b5e-2229c8d3215f\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-lnw2m" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.878027 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert\") pod \"infra-operator-controller-manager-79d975b745-vjpzc\" (UID: \"88ef567f-e68d-47aa-9788-4307003a77a0\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.878048 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnvwh\" (UniqueName: \"kubernetes.io/projected/7f4f0820-dd56-4d0b-aa5e-70dcab23e568-kube-api-access-qnvwh\") pod \"ironic-operator-controller-manager-554564d7fc-xhxm2\" (UID: \"7f4f0820-dd56-4d0b-aa5e-70dcab23e568\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-xhxm2" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.878148 4937 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-sw29j"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.878906 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-sw29j" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.885308 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qrn7l" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.891906 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-xhxm2"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.902992 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-72hwb" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.914795 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-sw29j"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.915822 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lkw74" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.929932 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntwv6\" (UniqueName: \"kubernetes.io/projected/7b01abed-0e59-495b-8b5e-2229c8d3215f-kube-api-access-ntwv6\") pod \"horizon-operator-controller-manager-5b9b8895d5-lnw2m\" (UID: \"7b01abed-0e59-495b-8b5e-2229c8d3215f\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-lnw2m" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.938081 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-95dsz" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.942628 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-g82nw"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.943436 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-g82nw" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.945534 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-c5n89" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.951097 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-q92r7" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.954354 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-g82nw"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.958992 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-ntf28"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.959814 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-ntf28" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.961388 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ktvfp" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.975965 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-k7z4s" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.979081 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68fsl\" (UniqueName: \"kubernetes.io/projected/18df78fd-5382-4716-9708-4e669508c898-kube-api-access-68fsl\") pod \"keystone-operator-controller-manager-b4d948c87-sw29j\" (UID: \"18df78fd-5382-4716-9708-4e669508c898\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-sw29j" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.979114 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert\") pod \"infra-operator-controller-manager-79d975b745-vjpzc\" (UID: \"88ef567f-e68d-47aa-9788-4307003a77a0\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.979144 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnvwh\" (UniqueName: \"kubernetes.io/projected/7f4f0820-dd56-4d0b-aa5e-70dcab23e568-kube-api-access-qnvwh\") pod \"ironic-operator-controller-manager-554564d7fc-xhxm2\" (UID: \"7f4f0820-dd56-4d0b-aa5e-70dcab23e568\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-xhxm2" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.979196 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjghp\" (UniqueName: \"kubernetes.io/projected/b688ff11-a838-4c26-90bd-974c871f4d44-kube-api-access-cjghp\") pod \"mariadb-operator-controller-manager-6994f66f48-ntf28\" (UID: \"b688ff11-a838-4c26-90bd-974c871f4d44\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-ntf28" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.979239 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rbg2\" (UniqueName: \"kubernetes.io/projected/88ef567f-e68d-47aa-9788-4307003a77a0-kube-api-access-4rbg2\") pod \"infra-operator-controller-manager-79d975b745-vjpzc\" (UID: \"88ef567f-e68d-47aa-9788-4307003a77a0\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.979283 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8fw8\" (UniqueName: \"kubernetes.io/projected/6cb3892f-a950-4dc7-9b9b-0db2876c569d-kube-api-access-v8fw8\") pod \"manila-operator-controller-manager-67d996989d-g82nw\" (UID: \"6cb3892f-a950-4dc7-9b9b-0db2876c569d\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-g82nw" Feb 25 16:08:11 crc kubenswrapper[4937]: E0225 16:08:11.979410 4937 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 16:08:11 crc kubenswrapper[4937]: 
E0225 16:08:11.979456 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert podName:88ef567f-e68d-47aa-9788-4307003a77a0 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:12.479437776 +0000 UTC m=+1343.492829666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert") pod "infra-operator-controller-manager-79d975b745-vjpzc" (UID: "88ef567f-e68d-47aa-9788-4307003a77a0") : secret "infra-operator-webhook-server-cert" not found Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.988615 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-ntf28"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.994215 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-bz4hc"] Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.995348 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bz4hc" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.997204 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rbg2\" (UniqueName: \"kubernetes.io/projected/88ef567f-e68d-47aa-9788-4307003a77a0-kube-api-access-4rbg2\") pod \"infra-operator-controller-manager-79d975b745-vjpzc\" (UID: \"88ef567f-e68d-47aa-9788-4307003a77a0\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:11 crc kubenswrapper[4937]: I0225 16:08:11.999174 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-jxtft" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.000465 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnvwh\" (UniqueName: \"kubernetes.io/projected/7f4f0820-dd56-4d0b-aa5e-70dcab23e568-kube-api-access-qnvwh\") pod \"ironic-operator-controller-manager-554564d7fc-xhxm2\" (UID: \"7f4f0820-dd56-4d0b-aa5e-70dcab23e568\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-xhxm2" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.000527 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.004840 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.007502 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-tq6fb" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.013283 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.014249 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.015827 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8mf42" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.018105 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-bz4hc"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.025272 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.026251 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.028181 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.028373 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-jcnvz" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.031885 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-vt5mw"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.032544 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vt5mw" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.033809 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-s52lh" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.044070 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.065004 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.080182 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwjhk\" (UniqueName: \"kubernetes.io/projected/4e98f637-2524-43db-9b27-4bd68ae19bf4-kube-api-access-gwjhk\") pod \"octavia-operator-controller-manager-659dc6bbfc-vh5zk\" (UID: \"4e98f637-2524-43db-9b27-4bd68ae19bf4\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.080236 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68fsl\" (UniqueName: \"kubernetes.io/projected/18df78fd-5382-4716-9708-4e669508c898-kube-api-access-68fsl\") pod \"keystone-operator-controller-manager-b4d948c87-sw29j\" (UID: \"18df78fd-5382-4716-9708-4e669508c898\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-sw29j" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.080311 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjghp\" (UniqueName: 
\"kubernetes.io/projected/b688ff11-a838-4c26-90bd-974c871f4d44-kube-api-access-cjghp\") pod \"mariadb-operator-controller-manager-6994f66f48-ntf28\" (UID: \"b688ff11-a838-4c26-90bd-974c871f4d44\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-ntf28" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.080334 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkj79\" (UniqueName: \"kubernetes.io/projected/e6bcab89-8beb-4879-8596-3a24805bd835-kube-api-access-lkj79\") pod \"ovn-operator-controller-manager-5955d8c787-vt5mw\" (UID: \"e6bcab89-8beb-4879-8596-3a24805bd835\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vt5mw" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.080361 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bvqx\" (UniqueName: \"kubernetes.io/projected/23514cd7-1535-4c0a-a090-68c39654dad2-kube-api-access-9bvqx\") pod \"neutron-operator-controller-manager-6bd4687957-bz4hc\" (UID: \"23514cd7-1535-4c0a-a090-68c39654dad2\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bz4hc" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.080385 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84tx9\" (UniqueName: \"kubernetes.io/projected/42c84a2f-b585-49c5-adb6-fb83ffecef77-kube-api-access-84tx9\") pod \"nova-operator-controller-manager-567668f5cf-n64lk\" (UID: \"42c84a2f-b585-49c5-adb6-fb83ffecef77\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.080423 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z82gc\" (UniqueName: \"kubernetes.io/projected/00b4788a-4566-469f-8731-51700725fea0-kube-api-access-z82gc\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9\" (UID: \"00b4788a-4566-469f-8731-51700725fea0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.080460 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9\" (UID: \"00b4788a-4566-469f-8731-51700725fea0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.080499 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8fw8\" (UniqueName: \"kubernetes.io/projected/6cb3892f-a950-4dc7-9b9b-0db2876c569d-kube-api-access-v8fw8\") pod \"manila-operator-controller-manager-67d996989d-g82nw\" (UID: \"6cb3892f-a950-4dc7-9b9b-0db2876c569d\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-g82nw" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.097309 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8fw8\" (UniqueName: \"kubernetes.io/projected/6cb3892f-a950-4dc7-9b9b-0db2876c569d-kube-api-access-v8fw8\") pod \"manila-operator-controller-manager-67d996989d-g82nw\" (UID: \"6cb3892f-a950-4dc7-9b9b-0db2876c569d\") " 
pod="openstack-operators/manila-operator-controller-manager-67d996989d-g82nw" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.097410 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjghp\" (UniqueName: \"kubernetes.io/projected/b688ff11-a838-4c26-90bd-974c871f4d44-kube-api-access-cjghp\") pod \"mariadb-operator-controller-manager-6994f66f48-ntf28\" (UID: \"b688ff11-a838-4c26-90bd-974c871f4d44\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-ntf28" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.097968 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68fsl\" (UniqueName: \"kubernetes.io/projected/18df78fd-5382-4716-9708-4e669508c898-kube-api-access-68fsl\") pod \"keystone-operator-controller-manager-b4d948c87-sw29j\" (UID: \"18df78fd-5382-4716-9708-4e669508c898\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-sw29j" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.099588 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.100526 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.105172 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.110470 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-vt5mw"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.110938 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-lnw2m" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.129089 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.130500 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-wqkqf" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.152582 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.183170 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bvqx\" (UniqueName: \"kubernetes.io/projected/23514cd7-1535-4c0a-a090-68c39654dad2-kube-api-access-9bvqx\") pod \"neutron-operator-controller-manager-6bd4687957-bz4hc\" (UID: \"23514cd7-1535-4c0a-a090-68c39654dad2\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bz4hc" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.183218 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpl6s\" (UniqueName: \"kubernetes.io/projected/b8448aa3-7cd0-4732-ad80-99fbefc125a6-kube-api-access-qpl6s\") pod \"placement-operator-controller-manager-8497b45c89-z2bqs\" (UID: \"b8448aa3-7cd0-4732-ad80-99fbefc125a6\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.183239 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84tx9\" (UniqueName: \"kubernetes.io/projected/42c84a2f-b585-49c5-adb6-fb83ffecef77-kube-api-access-84tx9\") pod \"nova-operator-controller-manager-567668f5cf-n64lk\" (UID: \"42c84a2f-b585-49c5-adb6-fb83ffecef77\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.183269 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z82gc\" (UniqueName: \"kubernetes.io/projected/00b4788a-4566-469f-8731-51700725fea0-kube-api-access-z82gc\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9\" (UID: \"00b4788a-4566-469f-8731-51700725fea0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.183297 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9\" (UID: \"00b4788a-4566-469f-8731-51700725fea0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.183327 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwjhk\" (UniqueName: \"kubernetes.io/projected/4e98f637-2524-43db-9b27-4bd68ae19bf4-kube-api-access-gwjhk\") pod \"octavia-operator-controller-manager-659dc6bbfc-vh5zk\" (UID: \"4e98f637-2524-43db-9b27-4bd68ae19bf4\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 
16:08:12.183393 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkj79\" (UniqueName: \"kubernetes.io/projected/e6bcab89-8beb-4879-8596-3a24805bd835-kube-api-access-lkj79\") pod \"ovn-operator-controller-manager-5955d8c787-vt5mw\" (UID: \"e6bcab89-8beb-4879-8596-3a24805bd835\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vt5mw" Feb 25 16:08:12 crc kubenswrapper[4937]: E0225 16:08:12.184021 4937 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 16:08:12 crc kubenswrapper[4937]: E0225 16:08:12.184060 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert podName:00b4788a-4566-469f-8731-51700725fea0 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:12.684047809 +0000 UTC m=+1343.697439699 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" (UID: "00b4788a-4566-469f-8731-51700725fea0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.189939 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.189980 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.190826 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.191379 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.198752 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-4wmvh" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.200698 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.202273 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-88g2l" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.208165 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bvqx\" (UniqueName: \"kubernetes.io/projected/23514cd7-1535-4c0a-a090-68c39654dad2-kube-api-access-9bvqx\") pod \"neutron-operator-controller-manager-6bd4687957-bz4hc\" (UID: \"23514cd7-1535-4c0a-a090-68c39654dad2\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bz4hc" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.209003 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-xhxm2" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.209170 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z82gc\" (UniqueName: \"kubernetes.io/projected/00b4788a-4566-469f-8731-51700725fea0-kube-api-access-z82gc\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9\" (UID: \"00b4788a-4566-469f-8731-51700725fea0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.210192 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkj79\" (UniqueName: \"kubernetes.io/projected/e6bcab89-8beb-4879-8596-3a24805bd835-kube-api-access-lkj79\") pod \"ovn-operator-controller-manager-5955d8c787-vt5mw\" (UID: \"e6bcab89-8beb-4879-8596-3a24805bd835\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vt5mw" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.210987 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84tx9\" (UniqueName: \"kubernetes.io/projected/42c84a2f-b585-49c5-adb6-fb83ffecef77-kube-api-access-84tx9\") pod \"nova-operator-controller-manager-567668f5cf-n64lk\" (UID: \"42c84a2f-b585-49c5-adb6-fb83ffecef77\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.214572 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwjhk\" (UniqueName: \"kubernetes.io/projected/4e98f637-2524-43db-9b27-4bd68ae19bf4-kube-api-access-gwjhk\") pod \"octavia-operator-controller-manager-659dc6bbfc-vh5zk\" (UID: \"4e98f637-2524-43db-9b27-4bd68ae19bf4\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.286077 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpl6s\" (UniqueName: \"kubernetes.io/projected/b8448aa3-7cd0-4732-ad80-99fbefc125a6-kube-api-access-qpl6s\") pod \"placement-operator-controller-manager-8497b45c89-z2bqs\" (UID: \"b8448aa3-7cd0-4732-ad80-99fbefc125a6\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.286136 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9n7p\" (UniqueName: \"kubernetes.io/projected/eefbad00-59b6-4e7c-b056-ba07663a665f-kube-api-access-d9n7p\") pod \"telemetry-operator-controller-manager-78747bd5c7-dtngf\" (UID: \"eefbad00-59b6-4e7c-b056-ba07663a665f\") " pod="openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.286179 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmw9f\" (UniqueName: \"kubernetes.io/projected/2dde13f7-ba29-4c24-94e0-052d622fe88c-kube-api-access-vmw9f\") pod \"swift-operator-controller-manager-68f46476f-s8x4b\" (UID: \"2dde13f7-ba29-4c24-94e0-052d622fe88c\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.296422 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-sw29j" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.307213 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-g82nw" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.316788 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-mtqtq"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.318419 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mtqtq" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.322020 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-ntf28" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.326741 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qb9p6" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.330962 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-mtqtq"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.331813 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bz4hc" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.342853 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.349403 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpl6s\" (UniqueName: \"kubernetes.io/projected/b8448aa3-7cd0-4732-ad80-99fbefc125a6-kube-api-access-qpl6s\") pod \"placement-operator-controller-manager-8497b45c89-z2bqs\" (UID: \"b8448aa3-7cd0-4732-ad80-99fbefc125a6\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.360955 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.361436 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.362304 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.369255 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-b4d6d" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.372614 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.387461 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9px4\" (UniqueName: \"kubernetes.io/projected/b8a9d073-1b33-4184-8727-28c957c96e5f-kube-api-access-g9px4\") pod \"test-operator-controller-manager-5dc6794d5b-mtqtq\" (UID: \"b8a9d073-1b33-4184-8727-28c957c96e5f\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mtqtq" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.387571 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9n7p\" (UniqueName: \"kubernetes.io/projected/eefbad00-59b6-4e7c-b056-ba07663a665f-kube-api-access-d9n7p\") pod \"telemetry-operator-controller-manager-78747bd5c7-dtngf\" (UID: \"eefbad00-59b6-4e7c-b056-ba07663a665f\") " pod="openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.387620 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmw9f\" (UniqueName: \"kubernetes.io/projected/2dde13f7-ba29-4c24-94e0-052d622fe88c-kube-api-access-vmw9f\") pod \"swift-operator-controller-manager-68f46476f-s8x4b\" (UID: \"2dde13f7-ba29-4c24-94e0-052d622fe88c\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.387654 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjghw\" (UniqueName: \"kubernetes.io/projected/5d73c9f2-ead1-410a-ad35-16b7ba251daa-kube-api-access-fjghw\") pod \"watcher-operator-controller-manager-bccc79885-pkrhd\" (UID: \"5d73c9f2-ead1-410a-ad35-16b7ba251daa\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.401633 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.402713 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.407782 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.408085 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nv5sw" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.408228 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.408284 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9n7p\" (UniqueName: \"kubernetes.io/projected/eefbad00-59b6-4e7c-b056-ba07663a665f-kube-api-access-d9n7p\") pod \"telemetry-operator-controller-manager-78747bd5c7-dtngf\" (UID: \"eefbad00-59b6-4e7c-b056-ba07663a665f\") " pod="openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.408311 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.412760 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmw9f\" (UniqueName: \"kubernetes.io/projected/2dde13f7-ba29-4c24-94e0-052d622fe88c-kube-api-access-vmw9f\") pod \"swift-operator-controller-manager-68f46476f-s8x4b\" (UID: \"2dde13f7-ba29-4c24-94e0-052d622fe88c\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.420464 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.421588 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.423982 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-sgs7d" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.432383 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.434453 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vt5mw" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.470877 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.489184 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9px4\" (UniqueName: \"kubernetes.io/projected/b8a9d073-1b33-4184-8727-28c957c96e5f-kube-api-access-g9px4\") pod \"test-operator-controller-manager-5dc6794d5b-mtqtq\" (UID: \"b8a9d073-1b33-4184-8727-28c957c96e5f\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mtqtq" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.489241 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87mx6\" (UniqueName: \"kubernetes.io/projected/c21d7933-3e35-48d9-8946-5ffdcc7a42bf-kube-api-access-87mx6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hmm4d\" (UID: \"c21d7933-3e35-48d9-8946-5ffdcc7a42bf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.489264 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrh5j\" (UniqueName: \"kubernetes.io/projected/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-kube-api-access-rrh5j\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.489317 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjghw\" (UniqueName: \"kubernetes.io/projected/5d73c9f2-ead1-410a-ad35-16b7ba251daa-kube-api-access-fjghw\") pod \"watcher-operator-controller-manager-bccc79885-pkrhd\" (UID: \"5d73c9f2-ead1-410a-ad35-16b7ba251daa\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.489360 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.489387 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert\") pod \"infra-operator-controller-manager-79d975b745-vjpzc\" (UID: \"88ef567f-e68d-47aa-9788-4307003a77a0\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.489419 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:12 crc kubenswrapper[4937]: E0225 16:08:12.489952 4937 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 16:08:12 
crc kubenswrapper[4937]: E0225 16:08:12.489993 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert podName:88ef567f-e68d-47aa-9788-4307003a77a0 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:13.489978538 +0000 UTC m=+1344.503370428 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert") pod "infra-operator-controller-manager-79d975b745-vjpzc" (UID: "88ef567f-e68d-47aa-9788-4307003a77a0") : secret "infra-operator-webhook-server-cert" not found Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.504614 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjghw\" (UniqueName: \"kubernetes.io/projected/5d73c9f2-ead1-410a-ad35-16b7ba251daa-kube-api-access-fjghw\") pod \"watcher-operator-controller-manager-bccc79885-pkrhd\" (UID: \"5d73c9f2-ead1-410a-ad35-16b7ba251daa\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.517114 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9px4\" (UniqueName: \"kubernetes.io/projected/b8a9d073-1b33-4184-8727-28c957c96e5f-kube-api-access-g9px4\") pod \"test-operator-controller-manager-5dc6794d5b-mtqtq\" (UID: \"b8a9d073-1b33-4184-8727-28c957c96e5f\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mtqtq" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.543825 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.550090 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-72hwb"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.556892 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.558268 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lkw74"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.592664 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.592752 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.592802 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87mx6\" (UniqueName: \"kubernetes.io/projected/c21d7933-3e35-48d9-8946-5ffdcc7a42bf-kube-api-access-87mx6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hmm4d\" (UID: \"c21d7933-3e35-48d9-8946-5ffdcc7a42bf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.592824 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrh5j\" (UniqueName: \"kubernetes.io/projected/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-kube-api-access-rrh5j\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:12 crc kubenswrapper[4937]: E0225 16:08:12.593621 4937 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 16:08:12 crc kubenswrapper[4937]: E0225 16:08:12.593716 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs podName:2007fabb-e6dd-4713-823d-f6a8a3cd41f1 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:13.093694315 +0000 UTC m=+1344.107086205 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs") pod "openstack-operator-controller-manager-fbcb9db89-8spmv" (UID: "2007fabb-e6dd-4713-823d-f6a8a3cd41f1") : secret "webhook-server-cert" not found Feb 25 16:08:12 crc kubenswrapper[4937]: E0225 16:08:12.593636 4937 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 16:08:12 crc kubenswrapper[4937]: E0225 16:08:12.593852 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs podName:2007fabb-e6dd-4713-823d-f6a8a3cd41f1 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:13.093834628 +0000 UTC m=+1344.107226598 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs") pod "openstack-operator-controller-manager-fbcb9db89-8spmv" (UID: "2007fabb-e6dd-4713-823d-f6a8a3cd41f1") : secret "metrics-server-cert" not found Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.614150 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrh5j\" (UniqueName: \"kubernetes.io/projected/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-kube-api-access-rrh5j\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.619324 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87mx6\" (UniqueName: \"kubernetes.io/projected/c21d7933-3e35-48d9-8946-5ffdcc7a42bf-kube-api-access-87mx6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hmm4d\" (UID: \"c21d7933-3e35-48d9-8946-5ffdcc7a42bf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.653774 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mtqtq" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.657280 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-95dsz"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.671335 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-q92r7"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.697708 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9\" (UID: \"00b4788a-4566-469f-8731-51700725fea0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:12 crc kubenswrapper[4937]: E0225 16:08:12.698039 4937 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 16:08:12 crc kubenswrapper[4937]: E0225 16:08:12.698104 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert podName:00b4788a-4566-469f-8731-51700725fea0 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:13.698080599 +0000 UTC m=+1344.711472499 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" (UID: "00b4788a-4566-469f-8731-51700725fea0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.719391 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.754876 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d" Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.813022 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-k7z4s"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.859859 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-lnw2m"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.889362 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-xhxm2"] Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.951026 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-xhxm2" event={"ID":"7f4f0820-dd56-4d0b-aa5e-70dcab23e568","Type":"ContainerStarted","Data":"b179569884497960f6ecb0b0b504cc4f55865324b93af11dcd711aca2b598deb"} Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.955048 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lkw74" event={"ID":"806dde6d-ac75-47d7-98e2-0ba5959614a3","Type":"ContainerStarted","Data":"50857bca76f7fd2294372fdc53ff095257e20dd7a0a090c475e18fb8305e227c"} Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.959880 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-k7z4s" event={"ID":"8132d735-0341-43be-93de-730c15511083","Type":"ContainerStarted","Data":"14434ae6bbd7657f87ef1a58b29705d9dfcb1504abdff3319542a99457368318"} Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.961818 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-95dsz" event={"ID":"5c7c6408-d0c4-42ea-ae7b-e10b49e13355","Type":"ContainerStarted","Data":"e82c327e030626eb7d5289e155404ee5da9d0358732d1180aa97303cec94767d"} Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.964237 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-q92r7" event={"ID":"079024f7-46b2-46fa-b96b-e4dca470cb4b","Type":"ContainerStarted","Data":"95e942fbb886002ba9901b0d5cabc688d2e67250a87ff671f83da7b4fd871394"} Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.965219 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-72hwb" event={"ID":"1e2c3857-1279-466f-8da3-ea1f5cf13893","Type":"ContainerStarted","Data":"724dcfe0e2527f6f6301cb0f96bad68cca9e26430f03766eaec7aa7d1f436634"} Feb 25 16:08:12 crc kubenswrapper[4937]: I0225 16:08:12.966973 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-lnw2m" event={"ID":"7b01abed-0e59-495b-8b5e-2229c8d3215f","Type":"ContainerStarted","Data":"3325b09c3f5004e4b4ade23a61041bb7902de36d7417c5dc84b4c4dd27739816"} Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.080630 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-g82nw"] Feb 25 16:08:13 crc kubenswrapper[4937]: W0225 16:08:13.086709 4937 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6bcab89_8beb_4879_8596_3a24805bd835.slice/crio-26ed95d0d4529915979d4efff866936c9707b8e409575961f0e62d482cce61cd WatchSource:0}: Error finding container 26ed95d0d4529915979d4efff866936c9707b8e409575961f0e62d482cce61cd: Status 404 returned error can't find the container with id 26ed95d0d4529915979d4efff866936c9707b8e409575961f0e62d482cce61cd Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.094675 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-vt5mw"] Feb 25 16:08:13 crc kubenswrapper[4937]: W0225 16:08:13.103312 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cb3892f_a950_4dc7_9b9b_0db2876c569d.slice/crio-2a692031a880f335d3c9565d259cb09e7327461f74e73fdf9336989fb67b8cac WatchSource:0}: Error finding container 2a692031a880f335d3c9565d259cb09e7327461f74e73fdf9336989fb67b8cac: Status 404 returned error can't find the container with id 2a692031a880f335d3c9565d259cb09e7327461f74e73fdf9336989fb67b8cac Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.107168 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.107230 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.107368 4937 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.107410 4937 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.107429 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs podName:2007fabb-e6dd-4713-823d-f6a8a3cd41f1 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:14.107414137 +0000 UTC m=+1345.120806027 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs") pod "openstack-operator-controller-manager-fbcb9db89-8spmv" (UID: "2007fabb-e6dd-4713-823d-f6a8a3cd41f1") : secret "webhook-server-cert" not found Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.107451 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs podName:2007fabb-e6dd-4713-823d-f6a8a3cd41f1 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:14.107437828 +0000 UTC m=+1345.120829718 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs") pod "openstack-operator-controller-manager-fbcb9db89-8spmv" (UID: "2007fabb-e6dd-4713-823d-f6a8a3cd41f1") : secret "metrics-server-cert" not found Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.311065 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-bz4hc"] Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.319034 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk"] Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.376649 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gwjhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-659dc6bbfc-vh5zk_openstack-operators(4e98f637-2524-43db-9b27-4bd68ae19bf4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.380193 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk" 
podUID="4e98f637-2524-43db-9b27-4bd68ae19bf4" Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.383069 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk"] Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.394725 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-sw29j"] Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.403158 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-ntf28"] Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.485582 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-mtqtq"] Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.491092 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b"] Feb 25 16:08:13 crc kubenswrapper[4937]: W0225 16:08:13.497443 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dde13f7_ba29_4c24_94e0_052d622fe88c.slice/crio-ef1ee5eadb4aa3fb2ec4e7132c5ace27eac68c03c2005641d487996bf4df884d WatchSource:0}: Error finding container ef1ee5eadb4aa3fb2ec4e7132c5ace27eac68c03c2005641d487996bf4df884d: Status 404 returned error can't find the container with id ef1ee5eadb4aa3fb2ec4e7132c5ace27eac68c03c2005641d487996bf4df884d Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.499227 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vmw9f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-s8x4b_openstack-operators(2dde13f7-ba29-4c24-94e0-052d622fe88c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.500947 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b" podUID="2dde13f7-ba29-4c24-94e0-052d622fe88c" Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.507877 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs"] Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.511920 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert\") pod \"infra-operator-controller-manager-79d975b745-vjpzc\" (UID: \"88ef567f-e68d-47aa-9788-4307003a77a0\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.512343 4937 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.512667 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert podName:88ef567f-e68d-47aa-9788-4307003a77a0 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:15.512640442 +0000 UTC m=+1346.526032392 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert") pod "infra-operator-controller-manager-79d975b745-vjpzc" (UID: "88ef567f-e68d-47aa-9788-4307003a77a0") : secret "infra-operator-webhook-server-cert" not found Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.514566 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf"] Feb 25 16:08:13 crc kubenswrapper[4937]: W0225 16:08:13.519237 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeefbad00_59b6_4e7c_b056_ba07663a665f.slice/crio-9bfa8ca1e6eca1901099ee7c5813b21c8fbe04c6dc37f5755da6e1f356e48da8 WatchSource:0}: Error finding container 9bfa8ca1e6eca1901099ee7c5813b21c8fbe04c6dc37f5755da6e1f356e48da8: Status 404 returned error can't find the container with id 9bfa8ca1e6eca1901099ee7c5813b21c8fbe04c6dc37f5755da6e1f356e48da8 Feb 25 16:08:13 crc kubenswrapper[4937]: W0225 16:08:13.521497 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8448aa3_7cd0_4732_ad80_99fbefc125a6.slice/crio-ff8ff5d7a07775a2d3506814e78eb58b98d7028f2644b29bce4d3e6d966cfd01 WatchSource:0}: Error finding container ff8ff5d7a07775a2d3506814e78eb58b98d7028f2644b29bce4d3e6d966cfd01: Status 404 returned error can't find the container with id ff8ff5d7a07775a2d3506814e78eb58b98d7028f2644b29bce4d3e6d966cfd01 Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.523729 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qpl6s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-z2bqs_openstack-operators(b8448aa3-7cd0-4732-ad80-99fbefc125a6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.525143 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.246:5001/openstack-k8s-operators/telemetry-operator:b962dba8f9ac766cc83bb429874f940b7f07744f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d9n7p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-78747bd5c7-dtngf_openstack-operators(eefbad00-59b6-4e7c-b056-ba07663a665f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 16:08:13 crc kubenswrapper[4937]: 
E0225 16:08:13.525296 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs" podUID="b8448aa3-7cd0-4732-ad80-99fbefc125a6" Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.526684 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf" podUID="eefbad00-59b6-4e7c-b056-ba07663a665f" Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.620392 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d"] Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.624685 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-87mx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-hmm4d_openstack-operators(c21d7933-3e35-48d9-8946-5ffdcc7a42bf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.627084 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d" podUID="c21d7933-3e35-48d9-8946-5ffdcc7a42bf" Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.631855 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd"] Feb 25 16:08:13 
crc kubenswrapper[4937]: W0225 16:08:13.642246 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d73c9f2_ead1_410a_ad35_16b7ba251daa.slice/crio-128db7ef3f169612463926e5ebc865934ffb8a39a3ff1b64631b0353dec55228 WatchSource:0}: Error finding container 128db7ef3f169612463926e5ebc865934ffb8a39a3ff1b64631b0353dec55228: Status 404 returned error can't find the container with id 128db7ef3f169612463926e5ebc865934ffb8a39a3ff1b64631b0353dec55228 Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.645717 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fjghw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-pkrhd_openstack-operators(5d73c9f2-ead1-410a-ad35-16b7ba251daa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.647280 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd" podUID="5d73c9f2-ead1-410a-ad35-16b7ba251daa" Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.728235 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9\" (UID: \"00b4788a-4566-469f-8731-51700725fea0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.728416 4937 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.728508 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert podName:00b4788a-4566-469f-8731-51700725fea0 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:15.728465756 +0000 UTC m=+1346.741857646 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" (UID: "00b4788a-4566-469f-8731-51700725fea0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.986035 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk" event={"ID":"4e98f637-2524-43db-9b27-4bd68ae19bf4","Type":"ContainerStarted","Data":"1ba38b3cde03738f2dc3134fb906086a67c198fb4410837a1ccf1b82ca9e9fc2"} Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.987304 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vt5mw" event={"ID":"e6bcab89-8beb-4879-8596-3a24805bd835","Type":"ContainerStarted","Data":"26ed95d0d4529915979d4efff866936c9707b8e409575961f0e62d482cce61cd"} Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.989986 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-g82nw" event={"ID":"6cb3892f-a950-4dc7-9b9b-0db2876c569d","Type":"ContainerStarted","Data":"2a692031a880f335d3c9565d259cb09e7327461f74e73fdf9336989fb67b8cac"} Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.990046 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk" podUID="4e98f637-2524-43db-9b27-4bd68ae19bf4" Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.990941 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bz4hc" event={"ID":"23514cd7-1535-4c0a-a090-68c39654dad2","Type":"ContainerStarted","Data":"a145f6dc4d5e37402ad42dc87080ed5cc5025d56e85307ac2896b1bab607c3f5"} Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.992335 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf" event={"ID":"eefbad00-59b6-4e7c-b056-ba07663a665f","Type":"ContainerStarted","Data":"9bfa8ca1e6eca1901099ee7c5813b21c8fbe04c6dc37f5755da6e1f356e48da8"} Feb 25 16:08:13 crc kubenswrapper[4937]: E0225 16:08:13.993376 4937 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.246:5001/openstack-k8s-operators/telemetry-operator:b962dba8f9ac766cc83bb429874f940b7f07744f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf" podUID="eefbad00-59b6-4e7c-b056-ba07663a665f" Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.995928 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-ntf28" event={"ID":"b688ff11-a838-4c26-90bd-974c871f4d44","Type":"ContainerStarted","Data":"8d85fc0c57b550ee156724c94e166e36938b8dc3e678709f1baa0531a90bd5be"} Feb 25 16:08:13 crc kubenswrapper[4937]: I0225 16:08:13.997535 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mtqtq" event={"ID":"b8a9d073-1b33-4184-8727-28c957c96e5f","Type":"ContainerStarted","Data":"e4e14e13dcd06fca6a1bc78f0568a7e7a3c086921bcc1db9410035c6d9372489"} Feb 25 16:08:14 crc kubenswrapper[4937]: I0225 16:08:14.006974 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d" event={"ID":"c21d7933-3e35-48d9-8946-5ffdcc7a42bf","Type":"ContainerStarted","Data":"0bf8786229e7c89e418250a227e1f347a67b1542ac5184984a88e8a8205bbe3c"} Feb 25 16:08:14 crc kubenswrapper[4937]: E0225 16:08:14.009409 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d" podUID="c21d7933-3e35-48d9-8946-5ffdcc7a42bf" Feb 25 16:08:14 crc kubenswrapper[4937]: E0225 16:08:14.010988 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b" podUID="2dde13f7-ba29-4c24-94e0-052d622fe88c" Feb 25 16:08:14 crc kubenswrapper[4937]: I0225 16:08:14.009702 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b" event={"ID":"2dde13f7-ba29-4c24-94e0-052d622fe88c","Type":"ContainerStarted","Data":"ef1ee5eadb4aa3fb2ec4e7132c5ace27eac68c03c2005641d487996bf4df884d"} Feb 25 16:08:14 crc kubenswrapper[4937]: I0225 16:08:14.011543 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-sw29j" event={"ID":"18df78fd-5382-4716-9708-4e669508c898","Type":"ContainerStarted","Data":"fca3fbcb41c734d3a363af1bfd1c7f06012ff6ac5ebbb3098499a8a5db4ee9b7"} Feb 25 16:08:14 crc kubenswrapper[4937]: I0225 16:08:14.013782 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd" event={"ID":"5d73c9f2-ead1-410a-ad35-16b7ba251daa","Type":"ContainerStarted","Data":"128db7ef3f169612463926e5ebc865934ffb8a39a3ff1b64631b0353dec55228"} Feb 25 16:08:14 crc kubenswrapper[4937]: I0225 16:08:14.019286 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk" event={"ID":"42c84a2f-b585-49c5-adb6-fb83ffecef77","Type":"ContainerStarted","Data":"cdf30bd5e397255151bb0346ec103c701e83f323fa00a79341ff103c11024a31"} Feb 25 16:08:14 crc kubenswrapper[4937]: E0225 16:08:14.019946 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd" podUID="5d73c9f2-ead1-410a-ad35-16b7ba251daa" Feb 25 16:08:14 crc kubenswrapper[4937]: I0225 16:08:14.021259 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs" event={"ID":"b8448aa3-7cd0-4732-ad80-99fbefc125a6","Type":"ContainerStarted","Data":"ff8ff5d7a07775a2d3506814e78eb58b98d7028f2644b29bce4d3e6d966cfd01"} Feb 25 16:08:14 crc kubenswrapper[4937]: E0225 16:08:14.022677 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs" podUID="b8448aa3-7cd0-4732-ad80-99fbefc125a6" Feb 25 16:08:14 crc kubenswrapper[4937]: I0225 16:08:14.133501 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:14 crc kubenswrapper[4937]: I0225 16:08:14.133664 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:14 crc kubenswrapper[4937]: E0225 16:08:14.134337 4937 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 16:08:14 crc kubenswrapper[4937]: E0225 16:08:14.134570 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs podName:2007fabb-e6dd-4713-823d-f6a8a3cd41f1 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:16.134556553 +0000 UTC m=+1347.147948443 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs") pod "openstack-operator-controller-manager-fbcb9db89-8spmv" (UID: "2007fabb-e6dd-4713-823d-f6a8a3cd41f1") : secret "metrics-server-cert" not found Feb 25 16:08:14 crc kubenswrapper[4937]: E0225 16:08:14.135955 4937 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 16:08:14 crc kubenswrapper[4937]: E0225 16:08:14.135990 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs podName:2007fabb-e6dd-4713-823d-f6a8a3cd41f1 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:16.135979209 +0000 UTC m=+1347.149371099 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs") pod "openstack-operator-controller-manager-fbcb9db89-8spmv" (UID: "2007fabb-e6dd-4713-823d-f6a8a3cd41f1") : secret "webhook-server-cert" not found Feb 25 16:08:15 crc kubenswrapper[4937]: E0225 16:08:15.033218 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b" podUID="2dde13f7-ba29-4c24-94e0-052d622fe88c" Feb 25 16:08:15 crc kubenswrapper[4937]: E0225 16:08:15.033425 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d" podUID="c21d7933-3e35-48d9-8946-5ffdcc7a42bf" Feb 25 16:08:15 crc kubenswrapper[4937]: E0225 16:08:15.033550 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd" podUID="5d73c9f2-ead1-410a-ad35-16b7ba251daa" Feb 25 16:08:15 crc kubenswrapper[4937]: E0225 16:08:15.033586 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.246:5001/openstack-k8s-operators/telemetry-operator:b962dba8f9ac766cc83bb429874f940b7f07744f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf" podUID="eefbad00-59b6-4e7c-b056-ba07663a665f" Feb 25 16:08:15 crc kubenswrapper[4937]: E0225 16:08:15.034180 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk" podUID="4e98f637-2524-43db-9b27-4bd68ae19bf4" Feb 25 16:08:15 crc kubenswrapper[4937]: E0225 
16:08:15.034227 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs" podUID="b8448aa3-7cd0-4732-ad80-99fbefc125a6" Feb 25 16:08:15 crc kubenswrapper[4937]: I0225 16:08:15.555874 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert\") pod \"infra-operator-controller-manager-79d975b745-vjpzc\" (UID: \"88ef567f-e68d-47aa-9788-4307003a77a0\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:15 crc kubenswrapper[4937]: E0225 16:08:15.556133 4937 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 16:08:15 crc kubenswrapper[4937]: E0225 16:08:15.556241 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert podName:88ef567f-e68d-47aa-9788-4307003a77a0 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:19.556223996 +0000 UTC m=+1350.569615876 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert") pod "infra-operator-controller-manager-79d975b745-vjpzc" (UID: "88ef567f-e68d-47aa-9788-4307003a77a0") : secret "infra-operator-webhook-server-cert" not found Feb 25 16:08:15 crc kubenswrapper[4937]: I0225 16:08:15.758209 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9\" (UID: \"00b4788a-4566-469f-8731-51700725fea0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:15 crc kubenswrapper[4937]: E0225 16:08:15.758330 4937 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 16:08:15 crc kubenswrapper[4937]: E0225 16:08:15.758394 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert podName:00b4788a-4566-469f-8731-51700725fea0 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:19.758374617 +0000 UTC m=+1350.771766507 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" (UID: "00b4788a-4566-469f-8731-51700725fea0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 16:08:16 crc kubenswrapper[4937]: I0225 16:08:16.164390 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:16 crc kubenswrapper[4937]: I0225 16:08:16.164760 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:16 crc kubenswrapper[4937]: E0225 16:08:16.164606 4937 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 16:08:16 crc kubenswrapper[4937]: E0225 16:08:16.164857 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs podName:2007fabb-e6dd-4713-823d-f6a8a3cd41f1 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:20.164838333 +0000 UTC m=+1351.178230213 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs") pod "openstack-operator-controller-manager-fbcb9db89-8spmv" (UID: "2007fabb-e6dd-4713-823d-f6a8a3cd41f1") : secret "webhook-server-cert" not found Feb 25 16:08:16 crc kubenswrapper[4937]: E0225 16:08:16.164923 4937 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 16:08:16 crc kubenswrapper[4937]: E0225 16:08:16.164972 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs podName:2007fabb-e6dd-4713-823d-f6a8a3cd41f1 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:20.164959066 +0000 UTC m=+1351.178350956 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs") pod "openstack-operator-controller-manager-fbcb9db89-8spmv" (UID: "2007fabb-e6dd-4713-823d-f6a8a3cd41f1") : secret "metrics-server-cert" not found Feb 25 16:08:19 crc kubenswrapper[4937]: I0225 16:08:19.614423 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert\") pod \"infra-operator-controller-manager-79d975b745-vjpzc\" (UID: \"88ef567f-e68d-47aa-9788-4307003a77a0\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:19 crc kubenswrapper[4937]: E0225 16:08:19.614569 4937 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 16:08:19 crc kubenswrapper[4937]: E0225 16:08:19.614626 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert podName:88ef567f-e68d-47aa-9788-4307003a77a0 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:27.614611261 +0000 UTC m=+1358.628003151 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert") pod "infra-operator-controller-manager-79d975b745-vjpzc" (UID: "88ef567f-e68d-47aa-9788-4307003a77a0") : secret "infra-operator-webhook-server-cert" not found Feb 25 16:08:19 crc kubenswrapper[4937]: I0225 16:08:19.817714 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9\" (UID: \"00b4788a-4566-469f-8731-51700725fea0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:19 crc kubenswrapper[4937]: E0225 16:08:19.817937 4937 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 16:08:19 crc kubenswrapper[4937]: E0225 16:08:19.818332 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert podName:00b4788a-4566-469f-8731-51700725fea0 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:27.818310911 +0000 UTC m=+1358.831702811 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" (UID: "00b4788a-4566-469f-8731-51700725fea0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 16:08:20 crc kubenswrapper[4937]: I0225 16:08:20.224396 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:20 crc kubenswrapper[4937]: I0225 16:08:20.224743 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:20 crc kubenswrapper[4937]: E0225 16:08:20.224918 4937 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 16:08:20 crc kubenswrapper[4937]: E0225 16:08:20.225029 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs podName:2007fabb-e6dd-4713-823d-f6a8a3cd41f1 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:28.225013273 +0000 UTC m=+1359.238405163 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs") pod "openstack-operator-controller-manager-fbcb9db89-8spmv" (UID: "2007fabb-e6dd-4713-823d-f6a8a3cd41f1") : secret "webhook-server-cert" not found Feb 25 16:08:20 crc kubenswrapper[4937]: E0225 16:08:20.225067 4937 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 16:08:20 crc kubenswrapper[4937]: E0225 16:08:20.225172 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs podName:2007fabb-e6dd-4713-823d-f6a8a3cd41f1 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:28.225164497 +0000 UTC m=+1359.238556387 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs") pod "openstack-operator-controller-manager-fbcb9db89-8spmv" (UID: "2007fabb-e6dd-4713-823d-f6a8a3cd41f1") : secret "metrics-server-cert" not found Feb 25 16:08:22 crc kubenswrapper[4937]: I0225 16:08:22.051121 4937 scope.go:117] "RemoveContainer" containerID="8792f524feedae4df73e9bab681c0463dc904dabe95f74cc4a2f01194714eded" Feb 25 16:08:27 crc kubenswrapper[4937]: I0225 16:08:27.650113 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert\") pod \"infra-operator-controller-manager-79d975b745-vjpzc\" (UID: \"88ef567f-e68d-47aa-9788-4307003a77a0\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:27 crc kubenswrapper[4937]: E0225 16:08:27.650291 4937 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 16:08:27 crc kubenswrapper[4937]: E0225 16:08:27.650806 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert podName:88ef567f-e68d-47aa-9788-4307003a77a0 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:43.650788075 +0000 UTC m=+1374.664179965 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert") pod "infra-operator-controller-manager-79d975b745-vjpzc" (UID: "88ef567f-e68d-47aa-9788-4307003a77a0") : secret "infra-operator-webhook-server-cert" not found Feb 25 16:08:27 crc kubenswrapper[4937]: I0225 16:08:27.852610 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9\" (UID: \"00b4788a-4566-469f-8731-51700725fea0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:27 crc kubenswrapper[4937]: E0225 16:08:27.852792 4937 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 16:08:27 crc kubenswrapper[4937]: E0225 16:08:27.852864 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert podName:00b4788a-4566-469f-8731-51700725fea0 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:43.852846309 +0000 UTC m=+1374.866238199 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" (UID: "00b4788a-4566-469f-8731-51700725fea0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 16:08:28 crc kubenswrapper[4937]: I0225 16:08:28.257906 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:28 crc kubenswrapper[4937]: E0225 16:08:28.258058 4937 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 16:08:28 crc kubenswrapper[4937]: I0225 16:08:28.258135 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:28 crc kubenswrapper[4937]: E0225 16:08:28.258219 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs podName:2007fabb-e6dd-4713-823d-f6a8a3cd41f1 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:44.258190569 +0000 UTC m=+1375.271582459 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs") pod "openstack-operator-controller-manager-fbcb9db89-8spmv" (UID: "2007fabb-e6dd-4713-823d-f6a8a3cd41f1") : secret "webhook-server-cert" not found Feb 25 16:08:28 crc kubenswrapper[4937]: E0225 16:08:28.258291 4937 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 16:08:28 crc kubenswrapper[4937]: E0225 16:08:28.258808 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs podName:2007fabb-e6dd-4713-823d-f6a8a3cd41f1 nodeName:}" failed. No retries permitted until 2026-02-25 16:08:44.258793114 +0000 UTC m=+1375.272185004 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs") pod "openstack-operator-controller-manager-fbcb9db89-8spmv" (UID: "2007fabb-e6dd-4713-823d-f6a8a3cd41f1") : secret "metrics-server-cert" not found Feb 25 16:08:32 crc kubenswrapper[4937]: E0225 16:08:32.435624 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 25 16:08:32 crc kubenswrapper[4937]: E0225 16:08:32.436240 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-84tx9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-n64lk_openstack-operators(42c84a2f-b585-49c5-adb6-fb83ffecef77): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:08:32 crc kubenswrapper[4937]: E0225 16:08:32.437994 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk" 
podUID="42c84a2f-b585-49c5-adb6-fb83ffecef77" Feb 25 16:08:33 crc kubenswrapper[4937]: E0225 16:08:33.187504 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk" podUID="42c84a2f-b585-49c5-adb6-fb83ffecef77" Feb 25 16:08:34 crc kubenswrapper[4937]: I0225 16:08:34.197790 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-95dsz" event={"ID":"5c7c6408-d0c4-42ea-ae7b-e10b49e13355","Type":"ContainerStarted","Data":"f957a7d0a02b471c73757ce9e0c7f18392fb22940cc9afae8f61adbaba98836f"} Feb 25 16:08:34 crc kubenswrapper[4937]: I0225 16:08:34.198356 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-95dsz" Feb 25 16:08:34 crc kubenswrapper[4937]: I0225 16:08:34.217896 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-95dsz" podStartSLOduration=10.220080721 podStartE2EDuration="23.217879975s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:12.696624602 +0000 UTC m=+1343.710016492" lastFinishedPulling="2026-02-25 16:08:25.694423856 +0000 UTC m=+1356.707815746" observedRunningTime="2026-02-25 16:08:34.21091693 +0000 UTC m=+1365.224308810" watchObservedRunningTime="2026-02-25 16:08:34.217879975 +0000 UTC m=+1365.231271865" Feb 25 16:08:35 crc kubenswrapper[4937]: I0225 16:08:35.208515 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vt5mw" event={"ID":"e6bcab89-8beb-4879-8596-3a24805bd835","Type":"ContainerStarted","Data":"4aef113c73cec0a2108ba13568d9771486f5674f483f2475ccbe4391fcd0ba02"} Feb 25 16:08:35 crc kubenswrapper[4937]: I0225 16:08:35.209047 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vt5mw" Feb 25 16:08:35 crc kubenswrapper[4937]: I0225 16:08:35.228008 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vt5mw" podStartSLOduration=4.87869941 podStartE2EDuration="24.227990503s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:13.089592629 +0000 UTC m=+1344.102984519" lastFinishedPulling="2026-02-25 16:08:32.438883722 +0000 UTC m=+1363.452275612" observedRunningTime="2026-02-25 16:08:35.226222179 +0000 UTC m=+1366.239614089" watchObservedRunningTime="2026-02-25 16:08:35.227990503 +0000 UTC m=+1366.241382393" Feb 25 16:08:40 crc kubenswrapper[4937]: I0225 16:08:40.253896 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-lnw2m" event={"ID":"7b01abed-0e59-495b-8b5e-2229c8d3215f","Type":"ContainerStarted","Data":"819ea9c9d4dedd9d7979508aaf07b396ba658f8fec00f32bd9ee3ab98c336575"} Feb 25 16:08:40 crc kubenswrapper[4937]: I0225 16:08:40.254467 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-lnw2m" Feb 25 16:08:40 crc kubenswrapper[4937]: 
I0225 16:08:40.257459 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-q92r7" event={"ID":"079024f7-46b2-46fa-b96b-e4dca470cb4b","Type":"ContainerStarted","Data":"fb45f529b8f63c6c930c87bbae3be173be303cb179bb43527250b639e8b870e2"} Feb 25 16:08:40 crc kubenswrapper[4937]: I0225 16:08:40.257824 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-q92r7" Feb 25 16:08:40 crc kubenswrapper[4937]: I0225 16:08:40.259260 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bz4hc" event={"ID":"23514cd7-1535-4c0a-a090-68c39654dad2","Type":"ContainerStarted","Data":"8e97882db9e9bb67d83bbac04eda988e5c8bcadf6d6eabefe742dcaec88305f6"} Feb 25 16:08:40 crc kubenswrapper[4937]: I0225 16:08:40.259414 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bz4hc" Feb 25 16:08:40 crc kubenswrapper[4937]: I0225 16:08:40.302215 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bz4hc" podStartSLOduration=10.177549106 podStartE2EDuration="29.302193993s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:13.315811895 +0000 UTC m=+1344.329203785" lastFinishedPulling="2026-02-25 16:08:32.440456782 +0000 UTC m=+1363.453848672" observedRunningTime="2026-02-25 16:08:40.299057325 +0000 UTC m=+1371.312449225" watchObservedRunningTime="2026-02-25 16:08:40.302193993 +0000 UTC m=+1371.315585883" Feb 25 16:08:40 crc kubenswrapper[4937]: I0225 16:08:40.304360 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-lnw2m" podStartSLOduration=9.745342513 podStartE2EDuration="29.304346067s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:12.881426607 +0000 UTC m=+1343.894818497" lastFinishedPulling="2026-02-25 16:08:32.440430161 +0000 UTC m=+1363.453822051" observedRunningTime="2026-02-25 16:08:40.274073928 +0000 UTC m=+1371.287465828" watchObservedRunningTime="2026-02-25 16:08:40.304346067 +0000 UTC m=+1371.317737947" Feb 25 16:08:40 crc kubenswrapper[4937]: I0225 16:08:40.327189 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-q92r7" podStartSLOduration=8.370583672 podStartE2EDuration="29.32717181s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:12.699111445 +0000 UTC m=+1343.712503335" lastFinishedPulling="2026-02-25 16:08:33.655699583 +0000 UTC m=+1364.669091473" observedRunningTime="2026-02-25 16:08:40.325822616 +0000 UTC m=+1371.339214526" watchObservedRunningTime="2026-02-25 16:08:40.32717181 +0000 UTC m=+1371.340563700" Feb 25 16:08:41 crc kubenswrapper[4937]: I0225 16:08:41.267470 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lkw74" event={"ID":"806dde6d-ac75-47d7-98e2-0ba5959614a3","Type":"ContainerStarted","Data":"4de98306a60dd68c0b7648d4e54daec5da30e58ade24a911895997041b0332a4"} Feb 25 16:08:41 crc kubenswrapper[4937]: I0225 16:08:41.270519 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk" event={"ID":"4e98f637-2524-43db-9b27-4bd68ae19bf4","Type":"ContainerStarted","Data":"d083ec942f1519a330ce3e0b0955eac65f42cc2ced336f31dff5dcf1c18f2b27"} Feb 25 16:08:41 crc kubenswrapper[4937]: I0225 16:08:41.943392 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-95dsz" Feb 25 16:08:41 crc kubenswrapper[4937]: I0225 16:08:41.968696 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk" podStartSLOduration=4.049688791 podStartE2EDuration="30.968678346s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:13.376512211 +0000 UTC m=+1344.389904101" lastFinishedPulling="2026-02-25 16:08:40.295501766 +0000 UTC m=+1371.308893656" observedRunningTime="2026-02-25 16:08:41.297583254 +0000 UTC m=+1372.310975154" watchObservedRunningTime="2026-02-25 16:08:41.968678346 +0000 UTC m=+1372.982070236" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.277128 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-sw29j" event={"ID":"18df78fd-5382-4716-9708-4e669508c898","Type":"ContainerStarted","Data":"25d48b20dd22b673bcc9275007344e6e532409ef23f02401a45e31a69322a8a3"} Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.277506 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-sw29j" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.278147 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd" event={"ID":"5d73c9f2-ead1-410a-ad35-16b7ba251daa","Type":"ContainerStarted","Data":"900b16e6e0466d406fa71314580a8ef9a4f58ecd507cfc116c78875f2147ffff"} Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.278337 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.279277 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-g82nw" event={"ID":"6cb3892f-a950-4dc7-9b9b-0db2876c569d","Type":"ContainerStarted","Data":"cffd87d2cd9173787ba9a535b22e1fc4dcf1c1bc968da493fa675a1a1d860ed4"} Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.279371 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-g82nw" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.280125 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf" event={"ID":"eefbad00-59b6-4e7c-b056-ba07663a665f","Type":"ContainerStarted","Data":"4e4d590e9d7e6c084a1d4ae2d6fd3762f360100a0c82aa8c229f5fc97038eb0b"} Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.280310 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.281510 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-72hwb" 
event={"ID":"1e2c3857-1279-466f-8da3-ea1f5cf13893","Type":"ContainerStarted","Data":"2a257b6205a3164da77607a3fd26dc6c56be7c4a5df203294fd8a0cad0dff5b3"} Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.281620 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-72hwb" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.282410 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d" event={"ID":"c21d7933-3e35-48d9-8946-5ffdcc7a42bf","Type":"ContainerStarted","Data":"aa37c311670a1e5e12829a2e872a9964df41acd74d374d0004d6aca9a6512089"} Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.283432 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-xhxm2" event={"ID":"7f4f0820-dd56-4d0b-aa5e-70dcab23e568","Type":"ContainerStarted","Data":"8c05ea319cb03296ca04931e5096c2fa8ebd7bff1412c994985d60cd32c78dd6"} Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.283517 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-xhxm2" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.284275 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-k7z4s" event={"ID":"8132d735-0341-43be-93de-730c15511083","Type":"ContainerStarted","Data":"001606ad67a6266b9410c8fd699612c24b417903387c21381f200f6bcd03d137"} Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.284341 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-k7z4s" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.285457 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b" event={"ID":"2dde13f7-ba29-4c24-94e0-052d622fe88c","Type":"ContainerStarted","Data":"1ff814e7199261ef10ac034d4a433171924dbf30128bd169156224a09b5553ac"} Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.285629 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.286383 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-ntf28" event={"ID":"b688ff11-a838-4c26-90bd-974c871f4d44","Type":"ContainerStarted","Data":"570670a43cf5b1fac10d4ce3f23c4968ea037ef6fb18b6edd20f1e14456495b1"} Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.287278 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mtqtq" event={"ID":"b8a9d073-1b33-4184-8727-28c957c96e5f","Type":"ContainerStarted","Data":"7f52cfdd437589fb4f7c5e797e3677791cc747fdb834a167def2645513a5cf9b"} Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.287370 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mtqtq" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.288199 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs" 
event={"ID":"b8448aa3-7cd0-4732-ad80-99fbefc125a6","Type":"ContainerStarted","Data":"2eded51e95490eb06bc818ab69bba2dabdc44b650c205c251b0d0862f0dcadd9"} Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.288333 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lkw74" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.288385 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.305602 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-sw29j" podStartSLOduration=11.014324574 podStartE2EDuration="31.30558706s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:13.364426957 +0000 UTC m=+1344.377818847" lastFinishedPulling="2026-02-25 16:08:33.655689443 +0000 UTC m=+1364.669081333" observedRunningTime="2026-02-25 16:08:42.300630696 +0000 UTC m=+1373.314022586" watchObservedRunningTime="2026-02-25 16:08:42.30558706 +0000 UTC m=+1373.318978940" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.319700 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b" podStartSLOduration=4.493268907 podStartE2EDuration="31.319687083s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:13.499096252 +0000 UTC m=+1344.512488142" lastFinishedPulling="2026-02-25 16:08:40.325514428 +0000 UTC m=+1371.338906318" observedRunningTime="2026-02-25 16:08:42.317234112 +0000 UTC m=+1373.330626012" watchObservedRunningTime="2026-02-25 16:08:42.319687083 +0000 UTC m=+1373.333078973" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.322604 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-ntf28" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.343606 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.346106 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-k7z4s" podStartSLOduration=10.56351059 podStartE2EDuration="31.346091975s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:12.873440376 +0000 UTC m=+1343.886832256" lastFinishedPulling="2026-02-25 16:08:33.656021751 +0000 UTC m=+1364.669413641" observedRunningTime="2026-02-25 16:08:42.344436634 +0000 UTC m=+1373.357828524" watchObservedRunningTime="2026-02-25 16:08:42.346091975 +0000 UTC m=+1373.359483865" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.378996 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-xhxm2" podStartSLOduration=11.85432663 podStartE2EDuration="31.37897794s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:12.914361465 +0000 UTC m=+1343.927753355" lastFinishedPulling="2026-02-25 16:08:32.439012775 +0000 UTC m=+1363.452404665" observedRunningTime="2026-02-25 16:08:42.376304893 +0000 UTC m=+1373.389696783" 
watchObservedRunningTime="2026-02-25 16:08:42.37897794 +0000 UTC m=+1373.392369830" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.403340 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs" podStartSLOduration=4.630410877 podStartE2EDuration="31.40332133s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:13.523614378 +0000 UTC m=+1344.537006268" lastFinishedPulling="2026-02-25 16:08:40.296524831 +0000 UTC m=+1371.309916721" observedRunningTime="2026-02-25 16:08:42.399301359 +0000 UTC m=+1373.412693249" watchObservedRunningTime="2026-02-25 16:08:42.40332133 +0000 UTC m=+1373.416713220" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.423807 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mtqtq" podStartSLOduration=11.473119939 podStartE2EDuration="30.423792873s" podCreationTimestamp="2026-02-25 16:08:12 +0000 UTC" firstStartedPulling="2026-02-25 16:08:13.490305691 +0000 UTC m=+1344.503697591" lastFinishedPulling="2026-02-25 16:08:32.440978635 +0000 UTC m=+1363.454370525" observedRunningTime="2026-02-25 16:08:42.420703695 +0000 UTC m=+1373.434095595" watchObservedRunningTime="2026-02-25 16:08:42.423792873 +0000 UTC m=+1373.437184763" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.438562 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vt5mw" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.462770 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-g82nw" podStartSLOduration=12.132909216 podStartE2EDuration="31.46275648s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:13.109064029 +0000 UTC m=+1344.122455919" lastFinishedPulling="2026-02-25 16:08:32.438911293 +0000 UTC m=+1363.452303183" observedRunningTime="2026-02-25 16:08:42.461032696 +0000 UTC m=+1373.474424586" watchObservedRunningTime="2026-02-25 16:08:42.46275648 +0000 UTC m=+1373.476148370" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.510573 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-ntf28" podStartSLOduration=12.435760325 podStartE2EDuration="31.510555938s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:13.364090299 +0000 UTC m=+1344.377482189" lastFinishedPulling="2026-02-25 16:08:32.438885912 +0000 UTC m=+1363.452277802" observedRunningTime="2026-02-25 16:08:42.507771568 +0000 UTC m=+1373.521163458" watchObservedRunningTime="2026-02-25 16:08:42.510555938 +0000 UTC m=+1373.523947828" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.531194 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd" podStartSLOduration=3.879639221 podStartE2EDuration="30.531179845s" podCreationTimestamp="2026-02-25 16:08:12 +0000 UTC" firstStartedPulling="2026-02-25 16:08:13.645565052 +0000 UTC m=+1344.658956942" lastFinishedPulling="2026-02-25 16:08:40.297105676 +0000 UTC m=+1371.310497566" observedRunningTime="2026-02-25 16:08:42.530049866 +0000 UTC m=+1373.543441766" watchObservedRunningTime="2026-02-25 
16:08:42.531179845 +0000 UTC m=+1373.544571735" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.551663 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf" podStartSLOduration=10.033326676 podStartE2EDuration="31.551645808s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:13.525047004 +0000 UTC m=+1344.538438894" lastFinishedPulling="2026-02-25 16:08:35.043366136 +0000 UTC m=+1366.056758026" observedRunningTime="2026-02-25 16:08:42.548420627 +0000 UTC m=+1373.561812517" watchObservedRunningTime="2026-02-25 16:08:42.551645808 +0000 UTC m=+1373.565037698" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.566608 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-72hwb" podStartSLOduration=10.562645513 podStartE2EDuration="31.566594282s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:12.651045216 +0000 UTC m=+1343.664437106" lastFinishedPulling="2026-02-25 16:08:33.654993985 +0000 UTC m=+1364.668385875" observedRunningTime="2026-02-25 16:08:42.566015388 +0000 UTC m=+1373.579407278" watchObservedRunningTime="2026-02-25 16:08:42.566594282 +0000 UTC m=+1373.579986172" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.605615 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hmm4d" podStartSLOduration=3.637401698 podStartE2EDuration="30.60559858s" podCreationTimestamp="2026-02-25 16:08:12 +0000 UTC" firstStartedPulling="2026-02-25 16:08:13.624540474 +0000 UTC m=+1344.637932364" lastFinishedPulling="2026-02-25 16:08:40.592737336 +0000 UTC m=+1371.606129246" observedRunningTime="2026-02-25 16:08:42.601132908 +0000 UTC m=+1373.614524808" watchObservedRunningTime="2026-02-25 16:08:42.60559858 +0000 UTC m=+1373.618990470" Feb 25 16:08:42 crc kubenswrapper[4937]: I0225 16:08:42.606132 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lkw74" podStartSLOduration=11.81642246 podStartE2EDuration="31.606128083s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:12.651347764 +0000 UTC m=+1343.664739654" lastFinishedPulling="2026-02-25 16:08:32.441053387 +0000 UTC m=+1363.454445277" observedRunningTime="2026-02-25 16:08:42.580576813 +0000 UTC m=+1373.593968703" watchObservedRunningTime="2026-02-25 16:08:42.606128083 +0000 UTC m=+1373.619519963" Feb 25 16:08:43 crc kubenswrapper[4937]: I0225 16:08:43.675531 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert\") pod \"infra-operator-controller-manager-79d975b745-vjpzc\" (UID: \"88ef567f-e68d-47aa-9788-4307003a77a0\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:43 crc kubenswrapper[4937]: I0225 16:08:43.683203 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88ef567f-e68d-47aa-9788-4307003a77a0-cert\") pod \"infra-operator-controller-manager-79d975b745-vjpzc\" (UID: \"88ef567f-e68d-47aa-9788-4307003a77a0\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:43 crc 
kubenswrapper[4937]: I0225 16:08:43.690936 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:08:43 crc kubenswrapper[4937]: I0225 16:08:43.923696 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9\" (UID: \"00b4788a-4566-469f-8731-51700725fea0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:43 crc kubenswrapper[4937]: I0225 16:08:43.937194 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b4788a-4566-469f-8731-51700725fea0-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9\" (UID: \"00b4788a-4566-469f-8731-51700725fea0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:44 crc kubenswrapper[4937]: I0225 16:08:44.176229 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:08:44 crc kubenswrapper[4937]: I0225 16:08:44.318332 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc"] Feb 25 16:08:44 crc kubenswrapper[4937]: W0225 16:08:44.327882 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88ef567f_e68d_47aa_9788_4307003a77a0.slice/crio-8eef5952ca587976011d37f9b83fc5108f2d93dd8f78c81baaa3e59960250a55 WatchSource:0}: Error finding container 8eef5952ca587976011d37f9b83fc5108f2d93dd8f78c81baaa3e59960250a55: Status 404 returned error can't find the container with id 8eef5952ca587976011d37f9b83fc5108f2d93dd8f78c81baaa3e59960250a55 Feb 25 16:08:44 crc kubenswrapper[4937]: I0225 16:08:44.329740 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:44 crc kubenswrapper[4937]: I0225 16:08:44.330261 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:44 crc kubenswrapper[4937]: I0225 16:08:44.335736 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-metrics-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:44 crc kubenswrapper[4937]: I0225 16:08:44.336401 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/2007fabb-e6dd-4713-823d-f6a8a3cd41f1-webhook-certs\") pod \"openstack-operator-controller-manager-fbcb9db89-8spmv\" (UID: \"2007fabb-e6dd-4713-823d-f6a8a3cd41f1\") " pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:44 crc kubenswrapper[4937]: I0225 16:08:44.429305 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9"] Feb 25 16:08:44 crc kubenswrapper[4937]: W0225 16:08:44.437655 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00b4788a_4566_469f_8731_51700725fea0.slice/crio-999a837f0ee95171fb530bb0c6a102f2dcec186fd8d848453a591427bbdbddb2 WatchSource:0}: Error finding container 999a837f0ee95171fb530bb0c6a102f2dcec186fd8d848453a591427bbdbddb2: Status 404 returned error can't find the container with id 999a837f0ee95171fb530bb0c6a102f2dcec186fd8d848453a591427bbdbddb2 Feb 25 16:08:44 crc kubenswrapper[4937]: I0225 16:08:44.539705 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:44 crc kubenswrapper[4937]: W0225 16:08:44.987857 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2007fabb_e6dd_4713_823d_f6a8a3cd41f1.slice/crio-c4f787f2f025cd59dbd22c7f96fafdb0d25403712fb862cdab555b9152411549 WatchSource:0}: Error finding container c4f787f2f025cd59dbd22c7f96fafdb0d25403712fb862cdab555b9152411549: Status 404 returned error can't find the container with id c4f787f2f025cd59dbd22c7f96fafdb0d25403712fb862cdab555b9152411549 Feb 25 16:08:44 crc kubenswrapper[4937]: I0225 16:08:44.993309 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv"] Feb 25 16:08:45 crc kubenswrapper[4937]: I0225 16:08:45.310123 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" event={"ID":"88ef567f-e68d-47aa-9788-4307003a77a0","Type":"ContainerStarted","Data":"8eef5952ca587976011d37f9b83fc5108f2d93dd8f78c81baaa3e59960250a55"} Feb 25 16:08:45 crc kubenswrapper[4937]: I0225 16:08:45.311331 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" event={"ID":"00b4788a-4566-469f-8731-51700725fea0","Type":"ContainerStarted","Data":"999a837f0ee95171fb530bb0c6a102f2dcec186fd8d848453a591427bbdbddb2"} Feb 25 16:08:45 crc kubenswrapper[4937]: I0225 16:08:45.312986 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" event={"ID":"2007fabb-e6dd-4713-823d-f6a8a3cd41f1","Type":"ContainerStarted","Data":"6724284c08daef4eef1c192612f8577c494fc4cb6e9e2396917e67704a17958e"} Feb 25 16:08:45 crc kubenswrapper[4937]: I0225 16:08:45.313037 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" event={"ID":"2007fabb-e6dd-4713-823d-f6a8a3cd41f1","Type":"ContainerStarted","Data":"c4f787f2f025cd59dbd22c7f96fafdb0d25403712fb862cdab555b9152411549"} Feb 25 16:08:45 crc kubenswrapper[4937]: I0225 16:08:45.313153 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:45 crc kubenswrapper[4937]: I0225 16:08:45.338588 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" podStartSLOduration=33.338571865 podStartE2EDuration="33.338571865s" podCreationTimestamp="2026-02-25 16:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:08:45.337004936 +0000 UTC m=+1376.350396846" watchObservedRunningTime="2026-02-25 16:08:45.338571865 +0000 UTC m=+1376.351963755" Feb 25 16:08:51 crc kubenswrapper[4937]: I0225 16:08:51.913290 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-72hwb" Feb 25 16:08:51 crc kubenswrapper[4937]: I0225 16:08:51.917985 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lkw74" Feb 25 16:08:51 crc kubenswrapper[4937]: I0225 16:08:51.954944 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-q92r7" Feb 25 16:08:51 crc kubenswrapper[4937]: I0225 16:08:51.978971 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-k7z4s" Feb 25 16:08:52 crc kubenswrapper[4937]: I0225 16:08:52.115361 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-lnw2m" Feb 25 16:08:52 crc kubenswrapper[4937]: I0225 16:08:52.212735 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-xhxm2" Feb 25 16:08:52 crc kubenswrapper[4937]: I0225 16:08:52.299443 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-sw29j" Feb 25 16:08:52 crc kubenswrapper[4937]: I0225 16:08:52.310597 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-g82nw" Feb 25 16:08:52 crc kubenswrapper[4937]: I0225 16:08:52.327843 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-ntf28" Feb 25 16:08:52 crc kubenswrapper[4937]: I0225 16:08:52.334790 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bz4hc" Feb 25 16:08:52 crc kubenswrapper[4937]: I0225 16:08:52.355001 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-vh5zk" Feb 25 16:08:52 crc kubenswrapper[4937]: I0225 16:08:52.473921 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-z2bqs" Feb 25 16:08:52 crc kubenswrapper[4937]: I0225 16:08:52.545743 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-s8x4b" Feb 25 16:08:52 crc kubenswrapper[4937]: I0225 16:08:52.561704 
4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-78747bd5c7-dtngf" Feb 25 16:08:52 crc kubenswrapper[4937]: I0225 16:08:52.657300 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-mtqtq" Feb 25 16:08:52 crc kubenswrapper[4937]: I0225 16:08:52.722452 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkrhd" Feb 25 16:08:54 crc kubenswrapper[4937]: I0225 16:08:54.548162 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-fbcb9db89-8spmv" Feb 25 16:08:59 crc kubenswrapper[4937]: E0225 16:08:59.977547 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e6f7c2a75883f63d270378b283faeee4c4c14fbd74b509c7da82621166f07b24" Feb 25 16:08:59 crc kubenswrapper[4937]: E0225 16:08:59.978352 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e6f7c2a75883f63d270378b283faeee4c4c14fbd74b509c7da82621166f07b24,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-ante
lope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.i
o/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/opens
tack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos
9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z82gc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9_openstack-operators(00b4788a-4566-469f-8731-51700725fea0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:08:59 crc kubenswrapper[4937]: E0225 16:08:59.980542 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" podUID="00b4788a-4566-469f-8731-51700725fea0" Feb 25 16:09:00 crc kubenswrapper[4937]: E0225 16:09:00.431682 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e6f7c2a75883f63d270378b283faeee4c4c14fbd74b509c7da82621166f07b24\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" podUID="00b4788a-4566-469f-8731-51700725fea0" Feb 25 16:09:00 crc kubenswrapper[4937]: E0225 16:09:00.606191 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a" Feb 25 16:09:00 crc kubenswrapper[4937]: E0225 16:09:00.606416 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4rbg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-79d975b745-vjpzc_openstack-operators(88ef567f-e68d-47aa-9788-4307003a77a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:09:00 crc kubenswrapper[4937]: E0225 16:09:00.607604 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" podUID="88ef567f-e68d-47aa-9788-4307003a77a0" Feb 25 16:09:01 crc kubenswrapper[4937]: I0225 16:09:01.439046 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk" event={"ID":"42c84a2f-b585-49c5-adb6-fb83ffecef77","Type":"ContainerStarted","Data":"0cde0296bf670024c8c84d2507419be3e4664149e31d2bdb31e61a9c5f227a7f"} Feb 25 16:09:01 crc kubenswrapper[4937]: I0225 16:09:01.441162 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk" Feb 25 16:09:01 crc kubenswrapper[4937]: E0225 16:09:01.441394 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a\\\"\"" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" podUID="88ef567f-e68d-47aa-9788-4307003a77a0" Feb 25 16:09:01 crc kubenswrapper[4937]: I0225 16:09:01.494177 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk" podStartSLOduration=3.477124115 podStartE2EDuration="50.494152099s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:13.33471729 +0000 UTC m=+1344.348109180" lastFinishedPulling="2026-02-25 16:09:00.351745234 +0000 UTC m=+1391.365137164" observedRunningTime="2026-02-25 16:09:01.48341017 +0000 UTC m=+1392.496802080" watchObservedRunningTime="2026-02-25 16:09:01.494152099 +0000 UTC m=+1392.507544029" Feb 25 16:09:12 crc kubenswrapper[4937]: I0225 16:09:12.366021 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n64lk" Feb 25 16:09:16 crc kubenswrapper[4937]: I0225 16:09:16.578907 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" 
event={"ID":"88ef567f-e68d-47aa-9788-4307003a77a0","Type":"ContainerStarted","Data":"3a37f85941d81d7c5d58fb897433134b72327961dfc1404215c9cab2f7c8e761"} Feb 25 16:09:16 crc kubenswrapper[4937]: I0225 16:09:16.580668 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:09:16 crc kubenswrapper[4937]: I0225 16:09:16.582017 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" event={"ID":"00b4788a-4566-469f-8731-51700725fea0","Type":"ContainerStarted","Data":"c8e6d184334f74cabd1a5570f96999044ced5f6e903d01fa39edb83047e2b21d"} Feb 25 16:09:16 crc kubenswrapper[4937]: I0225 16:09:16.582616 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:09:16 crc kubenswrapper[4937]: I0225 16:09:16.598342 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" podStartSLOduration=33.613621773 podStartE2EDuration="1m5.598325822s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:44.33263256 +0000 UTC m=+1375.346024450" lastFinishedPulling="2026-02-25 16:09:16.317336609 +0000 UTC m=+1407.330728499" observedRunningTime="2026-02-25 16:09:16.596399844 +0000 UTC m=+1407.609791724" watchObservedRunningTime="2026-02-25 16:09:16.598325822 +0000 UTC m=+1407.611717712" Feb 25 16:09:23 crc kubenswrapper[4937]: I0225 16:09:23.697955 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vjpzc" Feb 25 16:09:23 crc kubenswrapper[4937]: I0225 16:09:23.722509 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" podStartSLOduration=41.153060458 podStartE2EDuration="1m12.722471436s" podCreationTimestamp="2026-02-25 16:08:11 +0000 UTC" firstStartedPulling="2026-02-25 16:08:44.439372306 +0000 UTC m=+1375.452764186" lastFinishedPulling="2026-02-25 16:09:16.008783274 +0000 UTC m=+1407.022175164" observedRunningTime="2026-02-25 16:09:16.632111799 +0000 UTC m=+1407.645503679" watchObservedRunningTime="2026-02-25 16:09:23.722471436 +0000 UTC m=+1414.735863326" Feb 25 16:09:24 crc kubenswrapper[4937]: I0225 16:09:24.183824 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9" Feb 25 16:09:41 crc kubenswrapper[4937]: I0225 16:09:41.494447 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:09:41 crc kubenswrapper[4937]: I0225 16:09:41.494997 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.488588 4937 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c6qp7"] Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.490694 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-c6qp7" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.495383 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.495474 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.495408 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.499672 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-d7qg5" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.508006 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c6qp7"] Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.558604 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rgz7z"] Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.560359 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.562905 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.574672 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rgz7z"] Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.629036 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c5666a-48ac-4d24-9539-bc255bd0ef8a-config\") pod \"dnsmasq-dns-675f4bcbfc-c6qp7\" (UID: \"f7c5666a-48ac-4d24-9539-bc255bd0ef8a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c6qp7" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.629135 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6chw\" (UniqueName: \"kubernetes.io/projected/f7c5666a-48ac-4d24-9539-bc255bd0ef8a-kube-api-access-b6chw\") pod \"dnsmasq-dns-675f4bcbfc-c6qp7\" (UID: \"f7c5666a-48ac-4d24-9539-bc255bd0ef8a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c6qp7" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.730367 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5c34809-17e3-41c0-85f5-a8d427495310-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rgz7z\" (UID: \"d5c34809-17e3-41c0-85f5-a8d427495310\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.730444 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjbw6\" (UniqueName: \"kubernetes.io/projected/d5c34809-17e3-41c0-85f5-a8d427495310-kube-api-access-vjbw6\") pod \"dnsmasq-dns-78dd6ddcc-rgz7z\" (UID: \"d5c34809-17e3-41c0-85f5-a8d427495310\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.730525 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f7c5666a-48ac-4d24-9539-bc255bd0ef8a-config\") pod \"dnsmasq-dns-675f4bcbfc-c6qp7\" (UID: \"f7c5666a-48ac-4d24-9539-bc255bd0ef8a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c6qp7" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.730735 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5c34809-17e3-41c0-85f5-a8d427495310-config\") pod \"dnsmasq-dns-78dd6ddcc-rgz7z\" (UID: \"d5c34809-17e3-41c0-85f5-a8d427495310\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.730871 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6chw\" (UniqueName: \"kubernetes.io/projected/f7c5666a-48ac-4d24-9539-bc255bd0ef8a-kube-api-access-b6chw\") pod \"dnsmasq-dns-675f4bcbfc-c6qp7\" (UID: \"f7c5666a-48ac-4d24-9539-bc255bd0ef8a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c6qp7" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.731550 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c5666a-48ac-4d24-9539-bc255bd0ef8a-config\") pod \"dnsmasq-dns-675f4bcbfc-c6qp7\" (UID: \"f7c5666a-48ac-4d24-9539-bc255bd0ef8a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c6qp7" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.754721 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6chw\" (UniqueName: \"kubernetes.io/projected/f7c5666a-48ac-4d24-9539-bc255bd0ef8a-kube-api-access-b6chw\") pod \"dnsmasq-dns-675f4bcbfc-c6qp7\" (UID: \"f7c5666a-48ac-4d24-9539-bc255bd0ef8a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c6qp7" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.820954 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-c6qp7" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.832771 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5c34809-17e3-41c0-85f5-a8d427495310-config\") pod \"dnsmasq-dns-78dd6ddcc-rgz7z\" (UID: \"d5c34809-17e3-41c0-85f5-a8d427495310\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.832872 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5c34809-17e3-41c0-85f5-a8d427495310-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rgz7z\" (UID: \"d5c34809-17e3-41c0-85f5-a8d427495310\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.832933 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbw6\" (UniqueName: \"kubernetes.io/projected/d5c34809-17e3-41c0-85f5-a8d427495310-kube-api-access-vjbw6\") pod \"dnsmasq-dns-78dd6ddcc-rgz7z\" (UID: \"d5c34809-17e3-41c0-85f5-a8d427495310\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.833945 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5c34809-17e3-41c0-85f5-a8d427495310-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rgz7z\" (UID: \"d5c34809-17e3-41c0-85f5-a8d427495310\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.834202 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5c34809-17e3-41c0-85f5-a8d427495310-config\") pod \"dnsmasq-dns-78dd6ddcc-rgz7z\" (UID: \"d5c34809-17e3-41c0-85f5-a8d427495310\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.871064 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbw6\" (UniqueName: \"kubernetes.io/projected/d5c34809-17e3-41c0-85f5-a8d427495310-kube-api-access-vjbw6\") pod \"dnsmasq-dns-78dd6ddcc-rgz7z\" (UID: \"d5c34809-17e3-41c0-85f5-a8d427495310\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" Feb 25 16:09:48 crc kubenswrapper[4937]: I0225 16:09:48.888924 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" Feb 25 16:09:49 crc kubenswrapper[4937]: I0225 16:09:49.320297 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c6qp7"] Feb 25 16:09:49 crc kubenswrapper[4937]: W0225 16:09:49.332112 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c5666a_48ac_4d24_9539_bc255bd0ef8a.slice/crio-f7b043486d0fe0cee77da83d48d8650fc18bd858480f072ff3ea8cc3aaaf5c04 WatchSource:0}: Error finding container f7b043486d0fe0cee77da83d48d8650fc18bd858480f072ff3ea8cc3aaaf5c04: Status 404 returned error can't find the container with id f7b043486d0fe0cee77da83d48d8650fc18bd858480f072ff3ea8cc3aaaf5c04 Feb 25 16:09:49 crc kubenswrapper[4937]: I0225 16:09:49.387474 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rgz7z"] Feb 25 16:09:49 crc kubenswrapper[4937]: W0225 16:09:49.390306 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5c34809_17e3_41c0_85f5_a8d427495310.slice/crio-3860b83495f17b8ae61afe52d097d4d66019e8f5965a24d7a6d227b7641394e5 WatchSource:0}: Error finding container 3860b83495f17b8ae61afe52d097d4d66019e8f5965a24d7a6d227b7641394e5: Status 404 returned error can't find the container with id 3860b83495f17b8ae61afe52d097d4d66019e8f5965a24d7a6d227b7641394e5 Feb 25 16:09:49 crc kubenswrapper[4937]: I0225 16:09:49.848588 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-c6qp7" event={"ID":"f7c5666a-48ac-4d24-9539-bc255bd0ef8a","Type":"ContainerStarted","Data":"f7b043486d0fe0cee77da83d48d8650fc18bd858480f072ff3ea8cc3aaaf5c04"} Feb 25 16:09:49 crc kubenswrapper[4937]: I0225 16:09:49.849813 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" event={"ID":"d5c34809-17e3-41c0-85f5-a8d427495310","Type":"ContainerStarted","Data":"3860b83495f17b8ae61afe52d097d4d66019e8f5965a24d7a6d227b7641394e5"} Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.500179 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c6qp7"] Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.530902 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-72jcd"] Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.532615 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-72jcd" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.543048 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-72jcd"] Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.679623 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f1b597-5189-495f-bd75-69d65853615e-config\") pod \"dnsmasq-dns-666b6646f7-72jcd\" (UID: \"c4f1b597-5189-495f-bd75-69d65853615e\") " pod="openstack/dnsmasq-dns-666b6646f7-72jcd" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.679976 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wpz9\" (UniqueName: \"kubernetes.io/projected/c4f1b597-5189-495f-bd75-69d65853615e-kube-api-access-8wpz9\") pod \"dnsmasq-dns-666b6646f7-72jcd\" (UID: \"c4f1b597-5189-495f-bd75-69d65853615e\") " pod="openstack/dnsmasq-dns-666b6646f7-72jcd" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.680024 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4f1b597-5189-495f-bd75-69d65853615e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-72jcd\" (UID: \"c4f1b597-5189-495f-bd75-69d65853615e\") " pod="openstack/dnsmasq-dns-666b6646f7-72jcd" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.783272 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wpz9\" (UniqueName: \"kubernetes.io/projected/c4f1b597-5189-495f-bd75-69d65853615e-kube-api-access-8wpz9\") pod \"dnsmasq-dns-666b6646f7-72jcd\" (UID: \"c4f1b597-5189-495f-bd75-69d65853615e\") " pod="openstack/dnsmasq-dns-666b6646f7-72jcd" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.783338 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4f1b597-5189-495f-bd75-69d65853615e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-72jcd\" (UID: \"c4f1b597-5189-495f-bd75-69d65853615e\") " pod="openstack/dnsmasq-dns-666b6646f7-72jcd" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.783407 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f1b597-5189-495f-bd75-69d65853615e-config\") pod \"dnsmasq-dns-666b6646f7-72jcd\" (UID: \"c4f1b597-5189-495f-bd75-69d65853615e\") " pod="openstack/dnsmasq-dns-666b6646f7-72jcd" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.785079 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f1b597-5189-495f-bd75-69d65853615e-config\") pod \"dnsmasq-dns-666b6646f7-72jcd\" (UID: \"c4f1b597-5189-495f-bd75-69d65853615e\") " pod="openstack/dnsmasq-dns-666b6646f7-72jcd" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.785810 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4f1b597-5189-495f-bd75-69d65853615e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-72jcd\" (UID: \"c4f1b597-5189-495f-bd75-69d65853615e\") " pod="openstack/dnsmasq-dns-666b6646f7-72jcd" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.804519 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rgz7z"] Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.831266 
4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vdlq8"] Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.832335 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.837906 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wpz9\" (UniqueName: \"kubernetes.io/projected/c4f1b597-5189-495f-bd75-69d65853615e-kube-api-access-8wpz9\") pod \"dnsmasq-dns-666b6646f7-72jcd\" (UID: \"c4f1b597-5189-495f-bd75-69d65853615e\") " pod="openstack/dnsmasq-dns-666b6646f7-72jcd" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.866873 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vdlq8"] Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.883726 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-72jcd" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.989863 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phkz6\" (UniqueName: \"kubernetes.io/projected/cc0714eb-ff24-4699-af32-8e894985faa4-kube-api-access-phkz6\") pod \"dnsmasq-dns-57d769cc4f-vdlq8\" (UID: \"cc0714eb-ff24-4699-af32-8e894985faa4\") " pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.989996 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc0714eb-ff24-4699-af32-8e894985faa4-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vdlq8\" (UID: \"cc0714eb-ff24-4699-af32-8e894985faa4\") " pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" Feb 25 16:09:51 crc kubenswrapper[4937]: I0225 16:09:51.990027 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0714eb-ff24-4699-af32-8e894985faa4-config\") pod \"dnsmasq-dns-57d769cc4f-vdlq8\" (UID: \"cc0714eb-ff24-4699-af32-8e894985faa4\") " pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.091444 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phkz6\" (UniqueName: \"kubernetes.io/projected/cc0714eb-ff24-4699-af32-8e894985faa4-kube-api-access-phkz6\") pod \"dnsmasq-dns-57d769cc4f-vdlq8\" (UID: \"cc0714eb-ff24-4699-af32-8e894985faa4\") " pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.091564 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc0714eb-ff24-4699-af32-8e894985faa4-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vdlq8\" (UID: \"cc0714eb-ff24-4699-af32-8e894985faa4\") " pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.092962 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc0714eb-ff24-4699-af32-8e894985faa4-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vdlq8\" (UID: \"cc0714eb-ff24-4699-af32-8e894985faa4\") " pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.093015 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cc0714eb-ff24-4699-af32-8e894985faa4-config\") pod \"dnsmasq-dns-57d769cc4f-vdlq8\" (UID: \"cc0714eb-ff24-4699-af32-8e894985faa4\") " pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.093230 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0714eb-ff24-4699-af32-8e894985faa4-config\") pod \"dnsmasq-dns-57d769cc4f-vdlq8\" (UID: \"cc0714eb-ff24-4699-af32-8e894985faa4\") " pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.124383 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phkz6\" (UniqueName: \"kubernetes.io/projected/cc0714eb-ff24-4699-af32-8e894985faa4-kube-api-access-phkz6\") pod \"dnsmasq-dns-57d769cc4f-vdlq8\" (UID: \"cc0714eb-ff24-4699-af32-8e894985faa4\") " pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.189806 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.515601 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-72jcd"] Feb 25 16:09:52 crc kubenswrapper[4937]: W0225 16:09:52.531605 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4f1b597_5189_495f_bd75_69d65853615e.slice/crio-014779d44658eaa3c0675474ac2351b7101b077b742e35f7c9fe152e72461637 WatchSource:0}: Error finding container 014779d44658eaa3c0675474ac2351b7101b077b742e35f7c9fe152e72461637: Status 404 returned error can't find the container with id 014779d44658eaa3c0675474ac2351b7101b077b742e35f7c9fe152e72461637 Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.650432 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.651537 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.654570 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.654666 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.654818 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.666412 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.666945 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8rn57" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.666526 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.668916 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.731340 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.767065 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vdlq8"] Feb 25 16:09:52 crc kubenswrapper[4937]: W0225 16:09:52.781633 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc0714eb_ff24_4699_af32_8e894985faa4.slice/crio-4c21b269bc7d9de3e89b502ab7ecea74b47b879c6efc5f8faf072cc38bc85ba8 WatchSource:0}: Error finding container 4c21b269bc7d9de3e89b502ab7ecea74b47b879c6efc5f8faf072cc38bc85ba8: Status 404 returned error can't find the container with id 4c21b269bc7d9de3e89b502ab7ecea74b47b879c6efc5f8faf072cc38bc85ba8 Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.810849 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6895z\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-kube-api-access-6895z\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.810904 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9ebad40-444e-4250-85cb-2a154282cdf9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.810931 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.811000 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.811028 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.811063 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.811108 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9ebad40-444e-4250-85cb-2a154282cdf9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.811124 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-config-data\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.811157 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.811181 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.811197 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.885199 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-72jcd" event={"ID":"c4f1b597-5189-495f-bd75-69d65853615e","Type":"ContainerStarted","Data":"014779d44658eaa3c0675474ac2351b7101b077b742e35f7c9fe152e72461637"} Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.886513 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" event={"ID":"cc0714eb-ff24-4699-af32-8e894985faa4","Type":"ContainerStarted","Data":"4c21b269bc7d9de3e89b502ab7ecea74b47b879c6efc5f8faf072cc38bc85ba8"} Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.914573 4937 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.914621 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.914664 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.914705 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9ebad40-444e-4250-85cb-2a154282cdf9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.914724 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-config-data\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.914761 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.914789 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.914805 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.914824 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6895z\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-kube-api-access-6895z\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.914848 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9ebad40-444e-4250-85cb-2a154282cdf9-erlang-cookie-secret\") 
pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.914872 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.916441 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.920213 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.921589 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.922596 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.924092 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-config-data\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.925840 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.925886 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/da5caa1ff1e373df82c469164e0fcbd65bc5959bf5874e46dcc56bb65d0f7f87/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.926347 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9ebad40-444e-4250-85cb-2a154282cdf9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.927002 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9ebad40-444e-4250-85cb-2a154282cdf9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.928616 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.929093 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.946292 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6895z\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-kube-api-access-6895z\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.965892 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.967319 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.975371 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lw4h5" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.976975 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.977154 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.977151 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.977323 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.977408 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.977479 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\") pod \"rabbitmq-server-0\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " pod="openstack/rabbitmq-server-0" Feb 25 16:09:52 crc kubenswrapper[4937]: I0225 16:09:52.986119 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:52.995194 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.000793 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.117278 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5fz6\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-kube-api-access-n5fz6\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.117392 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.117502 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b47931ca-1102-49a2-a86d-b68c8818831a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47931ca-1102-49a2-a86d-b68c8818831a\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.117539 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.117567 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de5b4144-33d4-4860-9872-8826c78490a7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.117593 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de5b4144-33d4-4860-9872-8826c78490a7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.117627 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.117663 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.117679 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.117714 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.117749 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.219954 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.220611 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-config-data\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.220626 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.220631 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.220747 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.220796 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.220901 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5fz6\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-kube-api-access-n5fz6\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.221348 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.221454 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.221875 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.222076 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc 
kubenswrapper[4937]: I0225 16:09:53.222151 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.222235 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b47931ca-1102-49a2-a86d-b68c8818831a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47931ca-1102-49a2-a86d-b68c8818831a\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.222294 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.222330 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de5b4144-33d4-4860-9872-8826c78490a7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.222703 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de5b4144-33d4-4860-9872-8826c78490a7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.226119 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.229367 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.230077 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.230133 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b47931ca-1102-49a2-a86d-b68c8818831a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47931ca-1102-49a2-a86d-b68c8818831a\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fd27c87e194eec4e867ab6a57fac5224d596b3e0e40be0379bb0a1ae088b7613/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.235308 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de5b4144-33d4-4860-9872-8826c78490a7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.235402 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de5b4144-33d4-4860-9872-8826c78490a7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.242911 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5fz6\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-kube-api-access-n5fz6\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.297890 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b47931ca-1102-49a2-a86d-b68c8818831a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47931ca-1102-49a2-a86d-b68c8818831a\") pod \"rabbitmq-cell1-server-0\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.310227 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.656192 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.836335 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 16:09:53 crc kubenswrapper[4937]: W0225 16:09:53.852335 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde5b4144_33d4_4860_9872_8826c78490a7.slice/crio-bb534c50e734e38b97ebbded5f5e0bc8937f316e5238e2574bd2ccbcee2d5a97 WatchSource:0}: Error finding container bb534c50e734e38b97ebbded5f5e0bc8937f316e5238e2574bd2ccbcee2d5a97: Status 404 returned error can't find the container with id bb534c50e734e38b97ebbded5f5e0bc8937f316e5238e2574bd2ccbcee2d5a97 Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.906227 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"de5b4144-33d4-4860-9872-8826c78490a7","Type":"ContainerStarted","Data":"bb534c50e734e38b97ebbded5f5e0bc8937f316e5238e2574bd2ccbcee2d5a97"} Feb 25 16:09:53 crc kubenswrapper[4937]: I0225 16:09:53.908263 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9ebad40-444e-4250-85cb-2a154282cdf9","Type":"ContainerStarted","Data":"394ef23ce490ab50b87b6c8cd5665568dc62a47704fc244bd5c73b58d063cc3a"} Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.044536 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.051208 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.053581 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.053735 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.053812 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bc6sq" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.057315 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.057331 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.071496 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.150865 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2aaa7928-dcc3-47ad-95b1-78dd863786af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2aaa7928-dcc3-47ad-95b1-78dd863786af\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.150921 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnqh6\" (UniqueName: \"kubernetes.io/projected/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-kube-api-access-lnqh6\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.150952 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.150983 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.151025 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-config-data-default\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.151077 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-kolla-config\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.151106 4937 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.151175 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.252109 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.252164 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnqh6\" (UniqueName: \"kubernetes.io/projected/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-kube-api-access-lnqh6\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.252186 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2aaa7928-dcc3-47ad-95b1-78dd863786af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2aaa7928-dcc3-47ad-95b1-78dd863786af\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.252207 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.252225 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.252256 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-config-data-default\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.252293 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-kolla-config\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.252316 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.253466 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.254838 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-kolla-config\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.254902 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-config-data-default\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.255703 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.258092 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.258132 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2aaa7928-dcc3-47ad-95b1-78dd863786af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2aaa7928-dcc3-47ad-95b1-78dd863786af\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/49d6ce7901c3e62e200379f16836d713ceb9939aa0c4be822cda26481b400248/globalmount\"" pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.259208 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.272833 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.276367 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnqh6\" (UniqueName: \"kubernetes.io/projected/e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe-kube-api-access-lnqh6\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.323605 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2aaa7928-dcc3-47ad-95b1-78dd863786af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2aaa7928-dcc3-47ad-95b1-78dd863786af\") pod \"openstack-galera-0\" (UID: \"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe\") " pod="openstack/openstack-galera-0" Feb 25 16:09:54 crc kubenswrapper[4937]: I0225 16:09:54.386745 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.042264 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 25 16:09:55 crc kubenswrapper[4937]: W0225 16:09:55.047843 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0d3c0ee_079b_4bdc_9fdb_5d796b88b7fe.slice/crio-4ab08e4ee9ce85f04a9960228f59bca7c29a44a6161757813e8626f086d116c8 WatchSource:0}: Error finding container 4ab08e4ee9ce85f04a9960228f59bca7c29a44a6161757813e8626f086d116c8: Status 404 returned error can't find the container with id 4ab08e4ee9ce85f04a9960228f59bca7c29a44a6161757813e8626f086d116c8 Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.424057 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.426820 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.429829 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.431844 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.432629 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5rnp7" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.436870 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.437833 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.587442 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-92244eba-d154-4124-a3fa-524212298638\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92244eba-d154-4124-a3fa-524212298638\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.587533 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2484d7-6d50-43d2-9105-e83280f565ac-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.587572 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e2484d7-6d50-43d2-9105-e83280f565ac-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.587597 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl6vp\" (UniqueName: \"kubernetes.io/projected/9e2484d7-6d50-43d2-9105-e83280f565ac-kube-api-access-wl6vp\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.587619 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9e2484d7-6d50-43d2-9105-e83280f565ac-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.587664 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9e2484d7-6d50-43d2-9105-e83280f565ac-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.587692 4937 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e2484d7-6d50-43d2-9105-e83280f565ac-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.587753 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2484d7-6d50-43d2-9105-e83280f565ac-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.688974 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e2484d7-6d50-43d2-9105-e83280f565ac-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.689046 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2484d7-6d50-43d2-9105-e83280f565ac-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.689085 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-92244eba-d154-4124-a3fa-524212298638\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92244eba-d154-4124-a3fa-524212298638\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.689121 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2484d7-6d50-43d2-9105-e83280f565ac-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.689149 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e2484d7-6d50-43d2-9105-e83280f565ac-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.689179 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl6vp\" (UniqueName: \"kubernetes.io/projected/9e2484d7-6d50-43d2-9105-e83280f565ac-kube-api-access-wl6vp\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.689206 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9e2484d7-6d50-43d2-9105-e83280f565ac-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.689264 4937 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9e2484d7-6d50-43d2-9105-e83280f565ac-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.690073 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e2484d7-6d50-43d2-9105-e83280f565ac-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.690303 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9e2484d7-6d50-43d2-9105-e83280f565ac-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.690546 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9e2484d7-6d50-43d2-9105-e83280f565ac-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.690809 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e2484d7-6d50-43d2-9105-e83280f565ac-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.693133 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.693171 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-92244eba-d154-4124-a3fa-524212298638\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92244eba-d154-4124-a3fa-524212298638\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dd41367fc1d9f14918dbbf3ea24880f8ea633f3e3488c5ee444ea9e1de7648ee/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.705815 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2484d7-6d50-43d2-9105-e83280f565ac-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.718940 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2484d7-6d50-43d2-9105-e83280f565ac-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.739028 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl6vp\" (UniqueName: \"kubernetes.io/projected/9e2484d7-6d50-43d2-9105-e83280f565ac-kube-api-access-wl6vp\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.764706 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.765643 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.770020 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nw5h7" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.770171 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.770296 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.772816 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.777159 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-92244eba-d154-4124-a3fa-524212298638\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92244eba-d154-4124-a3fa-524212298638\") pod \"openstack-cell1-galera-0\" (UID: \"9e2484d7-6d50-43d2-9105-e83280f565ac\") " pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.892357 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ddd71fd-4c47-4357-87e7-16a2010a23df-config-data\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.892446 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ddd71fd-4c47-4357-87e7-16a2010a23df-kolla-config\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.892511 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ddd71fd-4c47-4357-87e7-16a2010a23df-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.892537 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ddd71fd-4c47-4357-87e7-16a2010a23df-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.892597 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnrdg\" (UniqueName: \"kubernetes.io/projected/2ddd71fd-4c47-4357-87e7-16a2010a23df-kube-api-access-jnrdg\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.936236 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe","Type":"ContainerStarted","Data":"4ab08e4ee9ce85f04a9960228f59bca7c29a44a6161757813e8626f086d116c8"} Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.994374 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/2ddd71fd-4c47-4357-87e7-16a2010a23df-config-data\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.995073 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ddd71fd-4c47-4357-87e7-16a2010a23df-kolla-config\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.995093 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ddd71fd-4c47-4357-87e7-16a2010a23df-config-data\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.995151 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ddd71fd-4c47-4357-87e7-16a2010a23df-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.995299 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ddd71fd-4c47-4357-87e7-16a2010a23df-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.995337 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnrdg\" (UniqueName: \"kubernetes.io/projected/2ddd71fd-4c47-4357-87e7-16a2010a23df-kube-api-access-jnrdg\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.995849 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ddd71fd-4c47-4357-87e7-16a2010a23df-kolla-config\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.998082 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ddd71fd-4c47-4357-87e7-16a2010a23df-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:55 crc kubenswrapper[4937]: I0225 16:09:55.998679 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ddd71fd-4c47-4357-87e7-16a2010a23df-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:56 crc kubenswrapper[4937]: I0225 16:09:56.020436 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnrdg\" (UniqueName: \"kubernetes.io/projected/2ddd71fd-4c47-4357-87e7-16a2010a23df-kube-api-access-jnrdg\") pod \"memcached-0\" (UID: \"2ddd71fd-4c47-4357-87e7-16a2010a23df\") " pod="openstack/memcached-0" Feb 25 16:09:56 crc kubenswrapper[4937]: I0225 16:09:56.047850 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 25 16:09:56 crc kubenswrapper[4937]: I0225 16:09:56.153868 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 25 16:09:57 crc kubenswrapper[4937]: I0225 16:09:56.564367 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 25 16:09:57 crc kubenswrapper[4937]: W0225 16:09:56.567583 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e2484d7_6d50_43d2_9105_e83280f565ac.slice/crio-83ab6b0b96dd4f3b759214cda7a38ccd50a6c440538984e5088fe3007e3b07c4 WatchSource:0}: Error finding container 83ab6b0b96dd4f3b759214cda7a38ccd50a6c440538984e5088fe3007e3b07c4: Status 404 returned error can't find the container with id 83ab6b0b96dd4f3b759214cda7a38ccd50a6c440538984e5088fe3007e3b07c4 Feb 25 16:09:57 crc kubenswrapper[4937]: I0225 16:09:56.691304 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 25 16:09:57 crc kubenswrapper[4937]: W0225 16:09:56.696011 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ddd71fd_4c47_4357_87e7_16a2010a23df.slice/crio-bf848acb8309b3d4ec7a38f71035ae1dedc822aa32c98fd555bb2fc998c9876c WatchSource:0}: Error finding container bf848acb8309b3d4ec7a38f71035ae1dedc822aa32c98fd555bb2fc998c9876c: Status 404 returned error can't find the container with id bf848acb8309b3d4ec7a38f71035ae1dedc822aa32c98fd555bb2fc998c9876c Feb 25 16:09:57 crc kubenswrapper[4937]: I0225 16:09:56.952467 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2ddd71fd-4c47-4357-87e7-16a2010a23df","Type":"ContainerStarted","Data":"bf848acb8309b3d4ec7a38f71035ae1dedc822aa32c98fd555bb2fc998c9876c"} Feb 25 16:09:57 crc kubenswrapper[4937]: I0225 16:09:56.960761 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9e2484d7-6d50-43d2-9105-e83280f565ac","Type":"ContainerStarted","Data":"83ab6b0b96dd4f3b759214cda7a38ccd50a6c440538984e5088fe3007e3b07c4"} Feb 25 16:09:57 crc kubenswrapper[4937]: I0225 16:09:57.852365 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 16:09:57 crc kubenswrapper[4937]: I0225 16:09:57.853989 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 16:09:57 crc kubenswrapper[4937]: I0225 16:09:57.868009 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 16:09:57 crc kubenswrapper[4937]: I0225 16:09:57.869551 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-hk2th" Feb 25 16:09:57 crc kubenswrapper[4937]: I0225 16:09:57.932526 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw7gh\" (UniqueName: \"kubernetes.io/projected/d34734ba-2195-4ea2-aa76-654c3c85a206-kube-api-access-dw7gh\") pod \"kube-state-metrics-0\" (UID: \"d34734ba-2195-4ea2-aa76-654c3c85a206\") " pod="openstack/kube-state-metrics-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.033920 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw7gh\" (UniqueName: \"kubernetes.io/projected/d34734ba-2195-4ea2-aa76-654c3c85a206-kube-api-access-dw7gh\") pod \"kube-state-metrics-0\" (UID: \"d34734ba-2195-4ea2-aa76-654c3c85a206\") " pod="openstack/kube-state-metrics-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.086425 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw7gh\" (UniqueName: \"kubernetes.io/projected/d34734ba-2195-4ea2-aa76-654c3c85a206-kube-api-access-dw7gh\") pod \"kube-state-metrics-0\" (UID: \"d34734ba-2195-4ea2-aa76-654c3c85a206\") " pod="openstack/kube-state-metrics-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.376068 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.518357 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.520646 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.522899 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.523060 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.523203 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.523449 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.525600 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-94hc2" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.547077 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.649407 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d773f4d2-bec3-4379-a7a2-29975a18c85b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.649469 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d773f4d2-bec3-4379-a7a2-29975a18c85b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.649591 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d773f4d2-bec3-4379-a7a2-29975a18c85b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.649711 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d773f4d2-bec3-4379-a7a2-29975a18c85b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.649778 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d773f4d2-bec3-4379-a7a2-29975a18c85b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.649932 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d773f4d2-bec3-4379-a7a2-29975a18c85b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.650028 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwzmr\" (UniqueName: \"kubernetes.io/projected/d773f4d2-bec3-4379-a7a2-29975a18c85b-kube-api-access-hwzmr\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.750960 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d773f4d2-bec3-4379-a7a2-29975a18c85b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.751014 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d773f4d2-bec3-4379-a7a2-29975a18c85b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.751063 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d773f4d2-bec3-4379-a7a2-29975a18c85b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.751097 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwzmr\" (UniqueName: \"kubernetes.io/projected/d773f4d2-bec3-4379-a7a2-29975a18c85b-kube-api-access-hwzmr\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.751123 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d773f4d2-bec3-4379-a7a2-29975a18c85b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.751146 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d773f4d2-bec3-4379-a7a2-29975a18c85b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.751192 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d773f4d2-bec3-4379-a7a2-29975a18c85b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.752092 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d773f4d2-bec3-4379-a7a2-29975a18c85b-alertmanager-metric-storage-db\") pod 
\"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.755449 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d773f4d2-bec3-4379-a7a2-29975a18c85b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.757443 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d773f4d2-bec3-4379-a7a2-29975a18c85b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.757518 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d773f4d2-bec3-4379-a7a2-29975a18c85b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.770154 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwzmr\" (UniqueName: \"kubernetes.io/projected/d773f4d2-bec3-4379-a7a2-29975a18c85b-kube-api-access-hwzmr\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.770322 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d773f4d2-bec3-4379-a7a2-29975a18c85b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.792398 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d773f4d2-bec3-4379-a7a2-29975a18c85b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"d773f4d2-bec3-4379-a7a2-29975a18c85b\") " pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:58 crc kubenswrapper[4937]: I0225 16:09:58.848181 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.076286 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.090026 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.096165 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.096391 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.096650 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-8sz4b" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.096835 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.096974 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.097103 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.097233 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.104180 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.120939 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.173749 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52qj2\" (UniqueName: \"kubernetes.io/projected/9f90fdcc-629f-46e9-9485-de80d43ea155-kube-api-access-52qj2\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.173814 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f90fdcc-629f-46e9-9485-de80d43ea155-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.173858 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.174123 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.174208 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.174253 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.174319 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-config\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.174343 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdcc-629f-46e9-9485-de80d43ea155-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.174362 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.174415 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.276823 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-config\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.276875 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdcc-629f-46e9-9485-de80d43ea155-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.276894 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.276940 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.276965 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52qj2\" (UniqueName: \"kubernetes.io/projected/9f90fdcc-629f-46e9-9485-de80d43ea155-kube-api-access-52qj2\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.277009 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f90fdcc-629f-46e9-9485-de80d43ea155-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.277038 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.277084 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.277111 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.277137 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.277717 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.278110 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.278638 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.296005 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.296117 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-config\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.296162 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.296209 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f90fdcc-629f-46e9-9485-de80d43ea155-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.296727 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdcc-629f-46e9-9485-de80d43ea155-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.307197 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52qj2\" (UniqueName: \"kubernetes.io/projected/9f90fdcc-629f-46e9-9485-de80d43ea155-kube-api-access-52qj2\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.320687 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.320736 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bcbb07c66c16a8bfff5bebc3b08f557f1544f8693883973b0b05074d97af7e5f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.359396 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"prometheus-metric-storage-0\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:09:59 crc kubenswrapper[4937]: I0225 16:09:59.460038 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:00 crc kubenswrapper[4937]: I0225 16:10:00.130933 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533930-xjhtn"] Feb 25 16:10:00 crc kubenswrapper[4937]: I0225 16:10:00.138505 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533930-xjhtn" Feb 25 16:10:00 crc kubenswrapper[4937]: I0225 16:10:00.142139 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:10:00 crc kubenswrapper[4937]: I0225 16:10:00.142554 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:10:00 crc kubenswrapper[4937]: I0225 16:10:00.142719 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:10:00 crc kubenswrapper[4937]: I0225 16:10:00.148258 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533930-xjhtn"] Feb 25 16:10:00 crc kubenswrapper[4937]: I0225 16:10:00.200556 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbswk\" (UniqueName: \"kubernetes.io/projected/005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8-kube-api-access-wbswk\") pod \"auto-csr-approver-29533930-xjhtn\" (UID: \"005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8\") " pod="openshift-infra/auto-csr-approver-29533930-xjhtn" Feb 25 16:10:00 crc kubenswrapper[4937]: I0225 16:10:00.303033 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbswk\" (UniqueName: \"kubernetes.io/projected/005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8-kube-api-access-wbswk\") pod \"auto-csr-approver-29533930-xjhtn\" (UID: \"005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8\") " pod="openshift-infra/auto-csr-approver-29533930-xjhtn" Feb 25 16:10:00 crc kubenswrapper[4937]: I0225 16:10:00.320686 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbswk\" (UniqueName: \"kubernetes.io/projected/005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8-kube-api-access-wbswk\") pod \"auto-csr-approver-29533930-xjhtn\" (UID: \"005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8\") " pod="openshift-infra/auto-csr-approver-29533930-xjhtn" Feb 25 16:10:00 
crc kubenswrapper[4937]: I0225 16:10:00.462252 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533930-xjhtn" Feb 25 16:10:00 crc kubenswrapper[4937]: I0225 16:10:00.522194 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.070094 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sgdhb"] Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.072035 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.074363 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-7hq94" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.074401 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.079795 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.097635 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sgdhb"] Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.132188 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54nn4\" (UniqueName: \"kubernetes.io/projected/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-kube-api-access-54nn4\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.132242 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-ovn-controller-tls-certs\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.132267 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-var-run-ovn\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.132302 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-combined-ca-bundle\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.132337 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-scripts\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.132355 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-var-run\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.132752 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-var-log-ovn\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.150050 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-4rsxl"] Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.152087 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.161141 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4rsxl"] Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.234705 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/388f0d04-d580-46ae-a729-667d81ad11a0-var-lib\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.234745 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/388f0d04-d580-46ae-a729-667d81ad11a0-scripts\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.234802 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-var-run-ovn\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.234917 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnlf9\" (UniqueName: \"kubernetes.io/projected/388f0d04-d580-46ae-a729-667d81ad11a0-kube-api-access-pnlf9\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.234999 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-combined-ca-bundle\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.235062 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/388f0d04-d580-46ae-a729-667d81ad11a0-var-log\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.235086 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-scripts\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.235112 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-var-run\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.235178 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/388f0d04-d580-46ae-a729-667d81ad11a0-etc-ovs\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.235220 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-var-log-ovn\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.235221 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-var-run-ovn\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.236030 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-var-log-ovn\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.236193 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-var-run\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.236316 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/388f0d04-d580-46ae-a729-667d81ad11a0-var-run\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.236445 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54nn4\" (UniqueName: \"kubernetes.io/projected/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-kube-api-access-54nn4\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.236524 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-ovn-controller-tls-certs\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 
16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.237181 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-scripts\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.239501 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-combined-ca-bundle\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.239762 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-ovn-controller-tls-certs\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.253390 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54nn4\" (UniqueName: \"kubernetes.io/projected/c0b0baed-3140-4ac4-9d27-e8fc15c390c2-kube-api-access-54nn4\") pod \"ovn-controller-sgdhb\" (UID: \"c0b0baed-3140-4ac4-9d27-e8fc15c390c2\") " pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.337812 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/388f0d04-d580-46ae-a729-667d81ad11a0-etc-ovs\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.337885 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/388f0d04-d580-46ae-a729-667d81ad11a0-var-run\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.337946 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/388f0d04-d580-46ae-a729-667d81ad11a0-var-lib\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.337967 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/388f0d04-d580-46ae-a729-667d81ad11a0-scripts\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.338018 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnlf9\" (UniqueName: \"kubernetes.io/projected/388f0d04-d580-46ae-a729-667d81ad11a0-kube-api-access-pnlf9\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.338072 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/388f0d04-d580-46ae-a729-667d81ad11a0-var-log\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.338128 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/388f0d04-d580-46ae-a729-667d81ad11a0-var-run\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.338236 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/388f0d04-d580-46ae-a729-667d81ad11a0-var-lib\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.338307 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/388f0d04-d580-46ae-a729-667d81ad11a0-var-log\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.338520 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/388f0d04-d580-46ae-a729-667d81ad11a0-etc-ovs\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.340247 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/388f0d04-d580-46ae-a729-667d81ad11a0-scripts\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.358227 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnlf9\" (UniqueName: \"kubernetes.io/projected/388f0d04-d580-46ae-a729-667d81ad11a0-kube-api-access-pnlf9\") pod \"ovn-controller-ovs-4rsxl\" (UID: \"388f0d04-d580-46ae-a729-667d81ad11a0\") " pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.389257 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:02 crc kubenswrapper[4937]: I0225 16:10:02.469579 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.783136 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.785180 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.796937 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.797010 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.797236 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-g7h66" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.797363 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.801019 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.801953 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.977036 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.977111 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.977141 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzthh\" (UniqueName: \"kubernetes.io/projected/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-kube-api-access-fzthh\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.977352 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-709af074-a6b0-4983-b7cf-61015a8e1ece\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-709af074-a6b0-4983-b7cf-61015a8e1ece\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.977577 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.977633 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.977733 4937 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:04 crc kubenswrapper[4937]: I0225 16:10:04.977963 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-config\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.079643 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzthh\" (UniqueName: \"kubernetes.io/projected/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-kube-api-access-fzthh\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.079703 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-709af074-a6b0-4983-b7cf-61015a8e1ece\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-709af074-a6b0-4983-b7cf-61015a8e1ece\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.079766 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.079783 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.079805 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.079843 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-config\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.079863 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.079917 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.080314 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.081299 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-config\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.081333 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.087282 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.088342 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.093360 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.100451 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.100506 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-709af074-a6b0-4983-b7cf-61015a8e1ece\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-709af074-a6b0-4983-b7cf-61015a8e1ece\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4508cfba0a71e3b9317924f918d913de1709266482cc1f20682004f9d9b5d31e/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.105390 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzthh\" (UniqueName: \"kubernetes.io/projected/a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08-kube-api-access-fzthh\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.136752 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-709af074-a6b0-4983-b7cf-61015a8e1ece\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-709af074-a6b0-4983-b7cf-61015a8e1ece\") pod \"ovsdbserver-sb-0\" (UID: \"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08\") " pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:05 crc kubenswrapper[4937]: I0225 16:10:05.402926 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.069157 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.070669 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.077886 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.078043 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.078732 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-z7xr6" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.078882 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.083456 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.228266 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a044be7-a58d-4684-8252-5a850694fb04-config\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.228333 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pbmv\" (UniqueName: \"kubernetes.io/projected/0a044be7-a58d-4684-8252-5a850694fb04-kube-api-access-6pbmv\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.228375 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0a044be7-a58d-4684-8252-5a850694fb04-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.228399 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a044be7-a58d-4684-8252-5a850694fb04-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.228421 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a044be7-a58d-4684-8252-5a850694fb04-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.228435 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a044be7-a58d-4684-8252-5a850694fb04-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.228470 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5185bdbf-211e-4050-ab69-2a7c46cb47f9\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5185bdbf-211e-4050-ab69-2a7c46cb47f9\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.228507 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a044be7-a58d-4684-8252-5a850694fb04-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.329406 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a044be7-a58d-4684-8252-5a850694fb04-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.329455 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a044be7-a58d-4684-8252-5a850694fb04-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.329512 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5185bdbf-211e-4050-ab69-2a7c46cb47f9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5185bdbf-211e-4050-ab69-2a7c46cb47f9\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.329535 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a044be7-a58d-4684-8252-5a850694fb04-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.329651 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a044be7-a58d-4684-8252-5a850694fb04-config\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.330535 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pbmv\" (UniqueName: \"kubernetes.io/projected/0a044be7-a58d-4684-8252-5a850694fb04-kube-api-access-6pbmv\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.330575 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0a044be7-a58d-4684-8252-5a850694fb04-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.330597 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a044be7-a58d-4684-8252-5a850694fb04-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" 
Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.331827 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a044be7-a58d-4684-8252-5a850694fb04-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.332314 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a044be7-a58d-4684-8252-5a850694fb04-config\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.332417 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0a044be7-a58d-4684-8252-5a850694fb04-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.335211 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a044be7-a58d-4684-8252-5a850694fb04-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.340438 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.340478 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5185bdbf-211e-4050-ab69-2a7c46cb47f9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5185bdbf-211e-4050-ab69-2a7c46cb47f9\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c3a49032bebfdad90b12bb617a99e4f5beb9823476599af1cb51de3b4a15f20d/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.345773 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pbmv\" (UniqueName: \"kubernetes.io/projected/0a044be7-a58d-4684-8252-5a850694fb04-kube-api-access-6pbmv\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.346588 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a044be7-a58d-4684-8252-5a850694fb04-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.346783 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a044be7-a58d-4684-8252-5a850694fb04-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.373247 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5185bdbf-211e-4050-ab69-2a7c46cb47f9\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5185bdbf-211e-4050-ab69-2a7c46cb47f9\") pod \"ovsdbserver-nb-0\" (UID: \"0a044be7-a58d-4684-8252-5a850694fb04\") " pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:06 crc kubenswrapper[4937]: I0225 16:10:06.451013 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.182753 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7"] Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.184115 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.190632 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.190926 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.191136 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.191574 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-6vssh" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.191721 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.198581 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7"] Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.305258 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48e5f2c6-d4ed-48a1-8737-693b54c43613-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.305326 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48e5f2c6-d4ed-48a1-8737-693b54c43613-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.305367 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/48e5f2c6-d4ed-48a1-8737-693b54c43613-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.305436 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: 
\"kubernetes.io/secret/48e5f2c6-d4ed-48a1-8737-693b54c43613-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.305469 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f8mb\" (UniqueName: \"kubernetes.io/projected/48e5f2c6-d4ed-48a1-8737-693b54c43613-kube-api-access-5f8mb\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.347980 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-86646"] Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.349113 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.353627 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.354707 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.356986 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.358623 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-86646"] Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.410784 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f72068e0-28e8-4c10-abeb-c067fe29c2f4-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.410832 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48e5f2c6-d4ed-48a1-8737-693b54c43613-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.410857 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48e5f2c6-d4ed-48a1-8737-693b54c43613-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.410879 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/f72068e0-28e8-4c10-abeb-c067fe29c2f4-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " 
pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.410899 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/f72068e0-28e8-4c10-abeb-c067fe29c2f4-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.410917 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f72068e0-28e8-4c10-abeb-c067fe29c2f4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.410937 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/f72068e0-28e8-4c10-abeb-c067fe29c2f4-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.410956 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/48e5f2c6-d4ed-48a1-8737-693b54c43613-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.411270 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/48e5f2c6-d4ed-48a1-8737-693b54c43613-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.411370 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4zg8\" (UniqueName: \"kubernetes.io/projected/f72068e0-28e8-4c10-abeb-c067fe29c2f4-kube-api-access-p4zg8\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.411464 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f8mb\" (UniqueName: \"kubernetes.io/projected/48e5f2c6-d4ed-48a1-8737-693b54c43613-kube-api-access-5f8mb\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.411882 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/48e5f2c6-d4ed-48a1-8737-693b54c43613-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.414292 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48e5f2c6-d4ed-48a1-8737-693b54c43613-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.424077 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/48e5f2c6-d4ed-48a1-8737-693b54c43613-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.427138 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/48e5f2c6-d4ed-48a1-8737-693b54c43613-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.452130 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f8mb\" (UniqueName: \"kubernetes.io/projected/48e5f2c6-d4ed-48a1-8737-693b54c43613-kube-api-access-5f8mb\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-mplc7\" (UID: \"48e5f2c6-d4ed-48a1-8737-693b54c43613\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.497697 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p"] Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.499172 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.499434 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p"] Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.505120 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.508287 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.511983 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.512203 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d351b94-5168-4f7f-9d70-c2cd2225dba8-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.512245 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4zg8\" (UniqueName: \"kubernetes.io/projected/f72068e0-28e8-4c10-abeb-c067fe29c2f4-kube-api-access-p4zg8\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.512291 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d351b94-5168-4f7f-9d70-c2cd2225dba8-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.512347 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/5d351b94-5168-4f7f-9d70-c2cd2225dba8-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.512371 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f72068e0-28e8-4c10-abeb-c067fe29c2f4-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.512404 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p87b2\" (UniqueName: \"kubernetes.io/projected/5d351b94-5168-4f7f-9d70-c2cd2225dba8-kube-api-access-p87b2\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.512443 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/f72068e0-28e8-4c10-abeb-c067fe29c2f4-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.512601 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: 
\"kubernetes.io/secret/f72068e0-28e8-4c10-abeb-c067fe29c2f4-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.512634 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f72068e0-28e8-4c10-abeb-c067fe29c2f4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.512662 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/5d351b94-5168-4f7f-9d70-c2cd2225dba8-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.512696 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/f72068e0-28e8-4c10-abeb-c067fe29c2f4-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.515762 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f72068e0-28e8-4c10-abeb-c067fe29c2f4-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.518603 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f72068e0-28e8-4c10-abeb-c067fe29c2f4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.522236 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/f72068e0-28e8-4c10-abeb-c067fe29c2f4-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.522811 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/f72068e0-28e8-4c10-abeb-c067fe29c2f4-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.531368 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/f72068e0-28e8-4c10-abeb-c067fe29c2f4-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.541924 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4zg8\" (UniqueName: \"kubernetes.io/projected/f72068e0-28e8-4c10-abeb-c067fe29c2f4-kube-api-access-p4zg8\") pod \"cloudkitty-lokistack-querier-58c84b5844-86646\" (UID: \"f72068e0-28e8-4c10-abeb-c067fe29c2f4\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.666089 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.667254 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/5d351b94-5168-4f7f-9d70-c2cd2225dba8-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.667306 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p87b2\" (UniqueName: \"kubernetes.io/projected/5d351b94-5168-4f7f-9d70-c2cd2225dba8-kube-api-access-p87b2\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.667334 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/5d351b94-5168-4f7f-9d70-c2cd2225dba8-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.667421 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d351b94-5168-4f7f-9d70-c2cd2225dba8-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.667469 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d351b94-5168-4f7f-9d70-c2cd2225dba8-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.668223 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d351b94-5168-4f7f-9d70-c2cd2225dba8-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: 
\"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.675582 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d351b94-5168-4f7f-9d70-c2cd2225dba8-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.684701 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/5d351b94-5168-4f7f-9d70-c2cd2225dba8-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.686753 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/5d351b94-5168-4f7f-9d70-c2cd2225dba8-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.710731 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss"] Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.712103 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7"] Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.715882 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.719196 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.722866 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.723142 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.723263 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.723348 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.723390 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.724255 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss"] Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.726220 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.726957 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p87b2\" (UniqueName: \"kubernetes.io/projected/5d351b94-5168-4f7f-9d70-c2cd2225dba8-kube-api-access-p87b2\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p\" (UID: \"5d351b94-5168-4f7f-9d70-c2cd2225dba8\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.729254 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-slbc7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.736028 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7"] Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873137 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873180 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873202 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: 
I0225 16:10:09.873452 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873528 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/836ae71a-cf0f-4a00-a0bc-78d1be68f830-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873578 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873614 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzjdz\" (UniqueName: \"kubernetes.io/projected/836ae71a-cf0f-4a00-a0bc-78d1be68f830-kube-api-access-wzjdz\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873636 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873656 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873695 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873713 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/836ae71a-cf0f-4a00-a0bc-78d1be68f830-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: 
\"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873729 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873746 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/836ae71a-cf0f-4a00-a0bc-78d1be68f830-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873793 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873825 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873882 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873915 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbv4c\" (UniqueName: \"kubernetes.io/projected/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-kube-api-access-xbv4c\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.873935 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.888967 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975134 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975194 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/836ae71a-cf0f-4a00-a0bc-78d1be68f830-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975237 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975259 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzjdz\" (UniqueName: \"kubernetes.io/projected/836ae71a-cf0f-4a00-a0bc-78d1be68f830-kube-api-access-wzjdz\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975282 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975297 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975330 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975346 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/836ae71a-cf0f-4a00-a0bc-78d1be68f830-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") 
" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975373 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975389 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/836ae71a-cf0f-4a00-a0bc-78d1be68f830-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975429 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975449 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975510 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975544 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbv4c\" (UniqueName: \"kubernetes.io/projected/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-kube-api-access-xbv4c\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975570 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975595 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 
16:10:09.975611 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.975626 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.976927 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.977849 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.978909 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.978970 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.979822 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.980262 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.982137 4937 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/836ae71a-cf0f-4a00-a0bc-78d1be68f830-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.983074 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.984029 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.984887 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/836ae71a-cf0f-4a00-a0bc-78d1be68f830-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.985776 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.986356 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/836ae71a-cf0f-4a00-a0bc-78d1be68f830-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.986059 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/836ae71a-cf0f-4a00-a0bc-78d1be68f830-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.988199 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.988875 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-tls-secret\") pod 
\"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.990535 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.993225 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzjdz\" (UniqueName: \"kubernetes.io/projected/836ae71a-cf0f-4a00-a0bc-78d1be68f830-kube-api-access-wzjdz\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-v7lt7\" (UID: \"836ae71a-cf0f-4a00-a0bc-78d1be68f830\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:09 crc kubenswrapper[4937]: I0225 16:10:09.998638 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbv4c\" (UniqueName: \"kubernetes.io/projected/e13a5d7c-5a1a-466b-83a0-d76859e2cd3e-kube-api-access-xbv4c\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-t9wss\" (UID: \"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.041364 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.061503 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.386512 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.389995 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.392804 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.396640 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.396953 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.452678 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.453849 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.457774 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.459469 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.467271 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.497266 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.497314 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tgv9\" (UniqueName: \"kubernetes.io/projected/123b3439-f7ab-44b4-bbed-02539668cf80-kube-api-access-6tgv9\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.497364 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.497538 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/123b3439-f7ab-44b4-bbed-02539668cf80-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.497704 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/123b3439-f7ab-44b4-bbed-02539668cf80-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.497741 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/123b3439-f7ab-44b4-bbed-02539668cf80-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.497763 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/123b3439-f7ab-44b4-bbed-02539668cf80-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 
16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.497810 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/123b3439-f7ab-44b4-bbed-02539668cf80-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.566755 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.567933 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.570736 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.570981 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.585188 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599271 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599333 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08382e6d-e8e5-4656-a524-26c8269114fd-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599369 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tgv9\" (UniqueName: \"kubernetes.io/projected/123b3439-f7ab-44b4-bbed-02539668cf80-kube-api-access-6tgv9\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599410 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmwd7\" (UniqueName: \"kubernetes.io/projected/08382e6d-e8e5-4656-a524-26c8269114fd-kube-api-access-dmwd7\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599440 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/08382e6d-e8e5-4656-a524-26c8269114fd-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599498 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/08382e6d-e8e5-4656-a524-26c8269114fd-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599529 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599587 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/123b3439-f7ab-44b4-bbed-02539668cf80-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599633 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599664 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/08382e6d-e8e5-4656-a524-26c8269114fd-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599708 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/123b3439-f7ab-44b4-bbed-02539668cf80-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599744 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/123b3439-f7ab-44b4-bbed-02539668cf80-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599768 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/123b3439-f7ab-44b4-bbed-02539668cf80-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599796 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08382e6d-e8e5-4656-a524-26c8269114fd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " 
pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599835 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/123b3439-f7ab-44b4-bbed-02539668cf80-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599849 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.599905 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.602433 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/123b3439-f7ab-44b4-bbed-02539668cf80-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.602563 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/123b3439-f7ab-44b4-bbed-02539668cf80-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.606763 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/123b3439-f7ab-44b4-bbed-02539668cf80-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.610237 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/123b3439-f7ab-44b4-bbed-02539668cf80-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.610931 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/123b3439-f7ab-44b4-bbed-02539668cf80-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.627989 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tgv9\" (UniqueName: 
\"kubernetes.io/projected/123b3439-f7ab-44b4-bbed-02539668cf80-kube-api-access-6tgv9\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.628383 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.632710 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"123b3439-f7ab-44b4-bbed-02539668cf80\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.701914 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4c8966b-44e5-42fd-ae20-d3099876ee36-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.701975 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6mq6\" (UniqueName: \"kubernetes.io/projected/c4c8966b-44e5-42fd-ae20-d3099876ee36-kube-api-access-k6mq6\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.702172 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/c4c8966b-44e5-42fd-ae20-d3099876ee36-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.702234 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08382e6d-e8e5-4656-a524-26c8269114fd-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.702286 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4c8966b-44e5-42fd-ae20-d3099876ee36-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.702310 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/c4c8966b-44e5-42fd-ae20-d3099876ee36-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " 
pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.702340 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmwd7\" (UniqueName: \"kubernetes.io/projected/08382e6d-e8e5-4656-a524-26c8269114fd-kube-api-access-dmwd7\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.702357 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/08382e6d-e8e5-4656-a524-26c8269114fd-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.702412 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/08382e6d-e8e5-4656-a524-26c8269114fd-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.702650 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.702676 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.702708 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/08382e6d-e8e5-4656-a524-26c8269114fd-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.702753 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/c4c8966b-44e5-42fd-ae20-d3099876ee36-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.702805 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08382e6d-e8e5-4656-a524-26c8269114fd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.703506 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.704048 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08382e6d-e8e5-4656-a524-26c8269114fd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.705646 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08382e6d-e8e5-4656-a524-26c8269114fd-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.706743 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/08382e6d-e8e5-4656-a524-26c8269114fd-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.714068 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/08382e6d-e8e5-4656-a524-26c8269114fd-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.720215 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/08382e6d-e8e5-4656-a524-26c8269114fd-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.723538 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.725302 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmwd7\" (UniqueName: \"kubernetes.io/projected/08382e6d-e8e5-4656-a524-26c8269114fd-kube-api-access-dmwd7\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"08382e6d-e8e5-4656-a524-26c8269114fd\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.783852 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.803807 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.803860 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/c4c8966b-44e5-42fd-ae20-d3099876ee36-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.803912 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4c8966b-44e5-42fd-ae20-d3099876ee36-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.803940 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6mq6\" (UniqueName: \"kubernetes.io/projected/c4c8966b-44e5-42fd-ae20-d3099876ee36-kube-api-access-k6mq6\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.803968 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/c4c8966b-44e5-42fd-ae20-d3099876ee36-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.803993 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4c8966b-44e5-42fd-ae20-d3099876ee36-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.804011 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/c4c8966b-44e5-42fd-ae20-d3099876ee36-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.804604 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.805314 4937 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4c8966b-44e5-42fd-ae20-d3099876ee36-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.806236 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4c8966b-44e5-42fd-ae20-d3099876ee36-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.807559 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/c4c8966b-44e5-42fd-ae20-d3099876ee36-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.807566 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/c4c8966b-44e5-42fd-ae20-d3099876ee36-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.809945 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/c4c8966b-44e5-42fd-ae20-d3099876ee36-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.820007 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6mq6\" (UniqueName: \"kubernetes.io/projected/c4c8966b-44e5-42fd-ae20-d3099876ee36-kube-api-access-k6mq6\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.827684 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c4c8966b-44e5-42fd-ae20-d3099876ee36\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:10 crc kubenswrapper[4937]: I0225 16:10:10.892819 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:11 crc kubenswrapper[4937]: I0225 16:10:11.007792 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:11 crc kubenswrapper[4937]: I0225 16:10:11.495283 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:10:11 crc kubenswrapper[4937]: I0225 16:10:11.495333 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:10:18 crc kubenswrapper[4937]: E0225 16:10:18.014738 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 25 16:10:18 crc kubenswrapper[4937]: E0225 16:10:18.015599 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vjbw6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-rgz7z_openstack(d5c34809-17e3-41c0-85f5-a8d427495310): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:10:18 crc kubenswrapper[4937]: 
E0225 16:10:18.017641 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" podUID="d5c34809-17e3-41c0-85f5-a8d427495310" Feb 25 16:10:18 crc kubenswrapper[4937]: E0225 16:10:18.027395 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 25 16:10:18 crc kubenswrapper[4937]: E0225 16:10:18.027709 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6chw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-c6qp7_openstack(f7c5666a-48ac-4d24-9539-bc255bd0ef8a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:10:18 crc kubenswrapper[4937]: E0225 16:10:18.029086 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-c6qp7" podUID="f7c5666a-48ac-4d24-9539-bc255bd0ef8a" Feb 25 16:10:18 crc kubenswrapper[4937]: E0225 16:10:18.088682 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 25 16:10:18 crc kubenswrapper[4937]: E0225 16:10:18.088960 4937 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-phkz6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-vdlq8_openstack(cc0714eb-ff24-4699-af32-8e894985faa4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:10:18 crc kubenswrapper[4937]: E0225 16:10:18.090635 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" podUID="cc0714eb-ff24-4699-af32-8e894985faa4" Feb 25 16:10:18 crc kubenswrapper[4937]: E0225 16:10:18.109269 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 25 16:10:18 crc kubenswrapper[4937]: E0225 16:10:18.109477 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wpz9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-72jcd_openstack(c4f1b597-5189-495f-bd75-69d65853615e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:10:18 crc kubenswrapper[4937]: E0225 16:10:18.110687 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-72jcd" podUID="c4f1b597-5189-495f-bd75-69d65853615e" Feb 25 16:10:18 crc kubenswrapper[4937]: E0225 16:10:18.175849 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-72jcd" podUID="c4f1b597-5189-495f-bd75-69d65853615e" Feb 25 16:10:18 crc kubenswrapper[4937]: E0225 16:10:18.176329 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" podUID="cc0714eb-ff24-4699-af32-8e894985faa4" Feb 25 16:10:19 crc kubenswrapper[4937]: E0225 16:10:19.192079 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Feb 25 16:10:19 crc kubenswrapper[4937]: E0225 16:10:19.192516 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c 
cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6895z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(b9ebad40-444e-4250-85cb-2a154282cdf9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:10:19 crc kubenswrapper[4937]: E0225 16:10:19.194342 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="b9ebad40-444e-4250-85cb-2a154282cdf9" Feb 25 16:10:19 crc kubenswrapper[4937]: E0225 16:10:19.209803 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Feb 25 16:10:19 crc kubenswrapper[4937]: E0225 16:10:19.210076 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp 
/tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n5fz6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(de5b4144-33d4-4860-9872-8826c78490a7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:10:19 crc kubenswrapper[4937]: E0225 16:10:19.211287 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="de5b4144-33d4-4860-9872-8826c78490a7" Feb 25 16:10:20 crc kubenswrapper[4937]: E0225 16:10:20.187768 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="de5b4144-33d4-4860-9872-8826c78490a7" Feb 25 16:10:20 crc kubenswrapper[4937]: E0225 16:10:20.187829 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="b9ebad40-444e-4250-85cb-2a154282cdf9" Feb 25 16:10:20 crc kubenswrapper[4937]: E0225 16:10:20.815972 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Feb 25 16:10:20 crc kubenswrapper[4937]: E0225 16:10:20.816186 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lnqh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:10:20 crc kubenswrapper[4937]: E0225 16:10:20.817759 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe" Feb 25 16:10:20 crc kubenswrapper[4937]: I0225 16:10:20.894917 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-c6qp7" Feb 25 16:10:20 crc kubenswrapper[4937]: I0225 16:10:20.901157 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.002912 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjbw6\" (UniqueName: \"kubernetes.io/projected/d5c34809-17e3-41c0-85f5-a8d427495310-kube-api-access-vjbw6\") pod \"d5c34809-17e3-41c0-85f5-a8d427495310\" (UID: \"d5c34809-17e3-41c0-85f5-a8d427495310\") " Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.002990 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5c34809-17e3-41c0-85f5-a8d427495310-config\") pod \"d5c34809-17e3-41c0-85f5-a8d427495310\" (UID: \"d5c34809-17e3-41c0-85f5-a8d427495310\") " Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.003032 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6chw\" (UniqueName: \"kubernetes.io/projected/f7c5666a-48ac-4d24-9539-bc255bd0ef8a-kube-api-access-b6chw\") pod \"f7c5666a-48ac-4d24-9539-bc255bd0ef8a\" (UID: \"f7c5666a-48ac-4d24-9539-bc255bd0ef8a\") " Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.003080 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c5666a-48ac-4d24-9539-bc255bd0ef8a-config\") pod \"f7c5666a-48ac-4d24-9539-bc255bd0ef8a\" (UID: \"f7c5666a-48ac-4d24-9539-bc255bd0ef8a\") " Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.003133 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5c34809-17e3-41c0-85f5-a8d427495310-dns-svc\") pod \"d5c34809-17e3-41c0-85f5-a8d427495310\" (UID: \"d5c34809-17e3-41c0-85f5-a8d427495310\") " Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.003796 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5c34809-17e3-41c0-85f5-a8d427495310-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5c34809-17e3-41c0-85f5-a8d427495310" (UID: "d5c34809-17e3-41c0-85f5-a8d427495310"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.004191 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c5666a-48ac-4d24-9539-bc255bd0ef8a-config" (OuterVolumeSpecName: "config") pod "f7c5666a-48ac-4d24-9539-bc255bd0ef8a" (UID: "f7c5666a-48ac-4d24-9539-bc255bd0ef8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.004271 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5c34809-17e3-41c0-85f5-a8d427495310-config" (OuterVolumeSpecName: "config") pod "d5c34809-17e3-41c0-85f5-a8d427495310" (UID: "d5c34809-17e3-41c0-85f5-a8d427495310"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.013075 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c5666a-48ac-4d24-9539-bc255bd0ef8a-kube-api-access-b6chw" (OuterVolumeSpecName: "kube-api-access-b6chw") pod "f7c5666a-48ac-4d24-9539-bc255bd0ef8a" (UID: "f7c5666a-48ac-4d24-9539-bc255bd0ef8a"). InnerVolumeSpecName "kube-api-access-b6chw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.013716 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c34809-17e3-41c0-85f5-a8d427495310-kube-api-access-vjbw6" (OuterVolumeSpecName: "kube-api-access-vjbw6") pod "d5c34809-17e3-41c0-85f5-a8d427495310" (UID: "d5c34809-17e3-41c0-85f5-a8d427495310"). InnerVolumeSpecName "kube-api-access-vjbw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.105060 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjbw6\" (UniqueName: \"kubernetes.io/projected/d5c34809-17e3-41c0-85f5-a8d427495310-kube-api-access-vjbw6\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.105092 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5c34809-17e3-41c0-85f5-a8d427495310-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.105101 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6chw\" (UniqueName: \"kubernetes.io/projected/f7c5666a-48ac-4d24-9539-bc255bd0ef8a-kube-api-access-b6chw\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.105110 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c5666a-48ac-4d24-9539-bc255bd0ef8a-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.105118 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5c34809-17e3-41c0-85f5-a8d427495310-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.195141 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-c6qp7" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.195165 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-c6qp7" event={"ID":"f7c5666a-48ac-4d24-9539-bc255bd0ef8a","Type":"ContainerDied","Data":"f7b043486d0fe0cee77da83d48d8650fc18bd858480f072ff3ea8cc3aaaf5c04"} Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.196691 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" event={"ID":"d5c34809-17e3-41c0-85f5-a8d427495310","Type":"ContainerDied","Data":"3860b83495f17b8ae61afe52d097d4d66019e8f5965a24d7a6d227b7641394e5"} Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.196764 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rgz7z" Feb 25 16:10:21 crc kubenswrapper[4937]: E0225 16:10:21.198555 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.328634 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rgz7z"] Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.336277 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rgz7z"] Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.370557 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c6qp7"] Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.394963 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5c34809-17e3-41c0-85f5-a8d427495310" path="/var/lib/kubelet/pods/d5c34809-17e3-41c0-85f5-a8d427495310/volumes" Feb 25 16:10:21 crc kubenswrapper[4937]: I0225 16:10:21.395417 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c6qp7"] Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.231436 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2ddd71fd-4c47-4357-87e7-16a2010a23df","Type":"ContainerStarted","Data":"9ed94b2ef95a28eace28b5718109e43b3f50741719fe4cfa048e6c3bf3bfef3c"} Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.232083 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.235194 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9e2484d7-6d50-43d2-9105-e83280f565ac","Type":"ContainerStarted","Data":"555a0f0fffa5a3625cfa1e73fe964c9108d8ff978d468033239bf0157cd76619"} Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.282164 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.904930843 podStartE2EDuration="28.282147416s" podCreationTimestamp="2026-02-25 16:09:55 +0000 UTC" firstStartedPulling="2026-02-25 16:09:56.698675387 +0000 UTC m=+1447.712067267" lastFinishedPulling="2026-02-25 16:10:22.07589195 +0000 UTC m=+1473.089283840" observedRunningTime="2026-02-25 16:10:23.261353675 +0000 UTC m=+1474.274745565" watchObservedRunningTime="2026-02-25 16:10:23.282147416 +0000 UTC m=+1474.295539306" Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.393637 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c5666a-48ac-4d24-9539-bc255bd0ef8a" path="/var/lib/kubelet/pods/f7c5666a-48ac-4d24-9539-bc255bd0ef8a/volumes" Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.444240 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533930-xjhtn"] Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.452069 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sgdhb"] Feb 25 16:10:23 crc kubenswrapper[4937]: W0225 16:10:23.509357 4937 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0b0baed_3140_4ac4_9d27_e8fc15c390c2.slice/crio-a829f79e9b1cc1fb82439f073b54a933320bad39df1f2fb66adf8eab5f94e978 WatchSource:0}: Error finding container a829f79e9b1cc1fb82439f073b54a933320bad39df1f2fb66adf8eab5f94e978: Status 404 returned error can't find the container with id a829f79e9b1cc1fb82439f073b54a933320bad39df1f2fb66adf8eab5f94e978 Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.521179 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p"] Feb 25 16:10:23 crc kubenswrapper[4937]: W0225 16:10:23.548088 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4c8966b_44e5_42fd_ae20_d3099876ee36.slice/crio-becc4bc49b242a813a4dc2dd5660a53dab767f21986e2bb91df5038b60b4ff23 WatchSource:0}: Error finding container becc4bc49b242a813a4dc2dd5660a53dab767f21986e2bb91df5038b60b4ff23: Status 404 returned error can't find the container with id becc4bc49b242a813a4dc2dd5660a53dab767f21986e2bb91df5038b60b4ff23 Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.548667 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.565908 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7"] Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.576418 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.582815 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.591013 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:10:23 crc kubenswrapper[4937]: W0225 16:10:23.643363 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f90fdcc_629f_46e9_9485_de80d43ea155.slice/crio-438dffc0133f7ac2f50c675b2864d60579b1e4a90b9b3ec7c4a4c9e908b21271 WatchSource:0}: Error finding container 438dffc0133f7ac2f50c675b2864d60579b1e4a90b9b3ec7c4a4c9e908b21271: Status 404 returned error can't find the container with id 438dffc0133f7ac2f50c675b2864d60579b1e4a90b9b3ec7c4a4c9e908b21271 Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.694238 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss"] Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.720302 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-86646"] Feb 25 16:10:23 crc kubenswrapper[4937]: W0225 16:10:23.722318 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf72068e0_28e8_4c10_abeb_c067fe29c2f4.slice/crio-629e59e1fca226850205cef7858b59b9ea04d468e16f15f6306811387e04387b WatchSource:0}: Error finding container 629e59e1fca226850205cef7858b59b9ea04d468e16f15f6306811387e04387b: Status 404 returned error can't find the container with id 629e59e1fca226850205cef7858b59b9ea04d468e16f15f6306811387e04387b Feb 25 16:10:23 crc kubenswrapper[4937]: E0225 16:10:23.725379 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:loki-querier,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=querier -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p4zg8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
cloudkitty-lokistack-querier-58c84b5844-86646_openstack(f72068e0-28e8-4c10-abeb-c067fe29c2f4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 16:10:23 crc kubenswrapper[4937]: E0225 16:10:23.726814 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" podUID="f72068e0-28e8-4c10-abeb-c067fe29c2f4" Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.731992 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.743554 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7"] Feb 25 16:10:23 crc kubenswrapper[4937]: I0225 16:10:23.749599 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 25 16:10:23 crc kubenswrapper[4937]: E0225 16:10:23.764202 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-compactor,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=compactor -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6tgv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/a
pi/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-compactor-0_openstack(123b3439-f7ab-44b4-bbed-02539668cf80): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 16:10:23 crc kubenswrapper[4937]: E0225 16:10:23.765543 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="123b3439-f7ab-44b4-bbed-02539668cf80" Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.119670 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 25 16:10:24 crc kubenswrapper[4937]: W0225 16:10:24.125234 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a044be7_a58d_4684_8252_5a850694fb04.slice/crio-8d7224cfac8dba0436706051d2d0b2be6be88f8bd4a7506da38d6f1bbb7c4d79 WatchSource:0}: Error finding container 8d7224cfac8dba0436706051d2d0b2be6be88f8bd4a7506da38d6f1bbb7c4d79: Status 404 returned error can't find the container with id 8d7224cfac8dba0436706051d2d0b2be6be88f8bd4a7506da38d6f1bbb7c4d79 Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.243658 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"c4c8966b-44e5-42fd-ae20-d3099876ee36","Type":"ContainerStarted","Data":"becc4bc49b242a813a4dc2dd5660a53dab767f21986e2bb91df5038b60b4ff23"} Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.244780 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d34734ba-2195-4ea2-aa76-654c3c85a206","Type":"ContainerStarted","Data":"13a26941777b61858e36029569e328b328b8267618d5b0cee4c672432c11c760"} Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.245868 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" event={"ID":"5d351b94-5168-4f7f-9d70-c2cd2225dba8","Type":"ContainerStarted","Data":"964c5447470a5bafc40f6b4de4a350ba6c8a061d1c0fb9ab4e44aa3347084258"} Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.247061 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" 
event={"ID":"123b3439-f7ab-44b4-bbed-02539668cf80","Type":"ContainerStarted","Data":"deaa29ecb20af689376df152517d309e1db6cdfce7eed3445790f33c310d8158"} Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.248406 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0a044be7-a58d-4684-8252-5a850694fb04","Type":"ContainerStarted","Data":"8d7224cfac8dba0436706051d2d0b2be6be88f8bd4a7506da38d6f1bbb7c4d79"} Feb 25 16:10:24 crc kubenswrapper[4937]: E0225 16:10:24.248669 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="123b3439-f7ab-44b4-bbed-02539668cf80" Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.249520 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"08382e6d-e8e5-4656-a524-26c8269114fd","Type":"ContainerStarted","Data":"1c14806eb230cbf50828b4bb3658ff72af077d1e104dafd7fdf40b49857caa34"} Feb 25 16:10:24 crc kubenswrapper[4937]: E0225 16:10:24.256990 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" podUID="f72068e0-28e8-4c10-abeb-c067fe29c2f4" Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.251060 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" event={"ID":"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e","Type":"ContainerStarted","Data":"afb2604a44f84a66b7dfe67eea014b043d04598c08a80641348d823452285971"} Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.257768 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9f90fdcc-629f-46e9-9485-de80d43ea155","Type":"ContainerStarted","Data":"438dffc0133f7ac2f50c675b2864d60579b1e4a90b9b3ec7c4a4c9e908b21271"} Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.258022 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sgdhb" event={"ID":"c0b0baed-3140-4ac4-9d27-e8fc15c390c2","Type":"ContainerStarted","Data":"a829f79e9b1cc1fb82439f073b54a933320bad39df1f2fb66adf8eab5f94e978"} Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.258040 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" event={"ID":"f72068e0-28e8-4c10-abeb-c067fe29c2f4","Type":"ContainerStarted","Data":"629e59e1fca226850205cef7858b59b9ea04d468e16f15f6306811387e04387b"} Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.258056 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d773f4d2-bec3-4379-a7a2-29975a18c85b","Type":"ContainerStarted","Data":"fc841500aef0869e4634a47f6f6d02e14e91acad403fdca62e197340b3ed5b0a"} Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.258069 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533930-xjhtn" 
event={"ID":"005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8","Type":"ContainerStarted","Data":"b2b3d8d4ba79239cfbc6e601017be3de5f1c145b4e25378254929e9dcd1c0bfb"} Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.258647 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" event={"ID":"836ae71a-cf0f-4a00-a0bc-78d1be68f830","Type":"ContainerStarted","Data":"fd26fe17a76fb0f9c0791a45cec8fb3f7f124e8cbd61a4ca5c92338131bb6be3"} Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.260210 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" event={"ID":"48e5f2c6-d4ed-48a1-8737-693b54c43613","Type":"ContainerStarted","Data":"fad4ad73aa68d8d41dd61e8304b73c00cd357f8ee0b54274593fd02db3341b48"} Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.713228 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 25 16:10:24 crc kubenswrapper[4937]: W0225 16:10:24.726945 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda02ae7ff_cbe3_4dae_9c3f_6dd285e0bd08.slice/crio-118959bd963ee34adf50e8c4f28498a0460742c01cb9be434472b2eb78af2fb4 WatchSource:0}: Error finding container 118959bd963ee34adf50e8c4f28498a0460742c01cb9be434472b2eb78af2fb4: Status 404 returned error can't find the container with id 118959bd963ee34adf50e8c4f28498a0460742c01cb9be434472b2eb78af2fb4 Feb 25 16:10:24 crc kubenswrapper[4937]: I0225 16:10:24.784603 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4rsxl"] Feb 25 16:10:24 crc kubenswrapper[4937]: W0225 16:10:24.786272 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod388f0d04_d580_46ae_a729_667d81ad11a0.slice/crio-0c3340df9c74c084eead1a9a55e5277999c50aa472eb94dfc5f7e3dfc0657c93 WatchSource:0}: Error finding container 0c3340df9c74c084eead1a9a55e5277999c50aa472eb94dfc5f7e3dfc0657c93: Status 404 returned error can't find the container with id 0c3340df9c74c084eead1a9a55e5277999c50aa472eb94dfc5f7e3dfc0657c93 Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.272549 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4rsxl" event={"ID":"388f0d04-d580-46ae-a729-667d81ad11a0","Type":"ContainerStarted","Data":"0c3340df9c74c084eead1a9a55e5277999c50aa472eb94dfc5f7e3dfc0657c93"} Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.274696 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08","Type":"ContainerStarted","Data":"118959bd963ee34adf50e8c4f28498a0460742c01cb9be434472b2eb78af2fb4"} Feb 25 16:10:25 crc kubenswrapper[4937]: E0225 16:10:25.276907 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" podUID="f72068e0-28e8-4c10-abeb-c067fe29c2f4" Feb 25 16:10:25 crc kubenswrapper[4937]: E0225 16:10:25.276990 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="123b3439-f7ab-44b4-bbed-02539668cf80" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.467620 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-lkm6g"] Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.469037 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.472726 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.485180 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lkm6g"] Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.581842 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c73995-8885-40f2-8491-6216d1ec5c7b-combined-ca-bundle\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.581940 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9c73995-8885-40f2-8491-6216d1ec5c7b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.582177 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9c73995-8885-40f2-8491-6216d1ec5c7b-config\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.582369 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2t4r\" (UniqueName: \"kubernetes.io/projected/d9c73995-8885-40f2-8491-6216d1ec5c7b-kube-api-access-m2t4r\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.582696 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d9c73995-8885-40f2-8491-6216d1ec5c7b-ovs-rundir\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.583088 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d9c73995-8885-40f2-8491-6216d1ec5c7b-ovn-rundir\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.648780 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vdlq8"] Feb 25 16:10:25 crc 
kubenswrapper[4937]: I0225 16:10:25.685022 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9c73995-8885-40f2-8491-6216d1ec5c7b-config\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.685082 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2t4r\" (UniqueName: \"kubernetes.io/projected/d9c73995-8885-40f2-8491-6216d1ec5c7b-kube-api-access-m2t4r\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.685150 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d9c73995-8885-40f2-8491-6216d1ec5c7b-ovs-rundir\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.685171 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d9c73995-8885-40f2-8491-6216d1ec5c7b-ovn-rundir\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.685217 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9c73995-8885-40f2-8491-6216d1ec5c7b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.685232 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c73995-8885-40f2-8491-6216d1ec5c7b-combined-ca-bundle\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.685560 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d9c73995-8885-40f2-8491-6216d1ec5c7b-ovn-rundir\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.685653 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d9c73995-8885-40f2-8491-6216d1ec5c7b-ovs-rundir\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.686007 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9c73995-8885-40f2-8491-6216d1ec5c7b-config\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.700150 4937 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c73995-8885-40f2-8491-6216d1ec5c7b-combined-ca-bundle\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.708084 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9c73995-8885-40f2-8491-6216d1ec5c7b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.732373 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-zhfq5"] Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.733906 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.736949 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2t4r\" (UniqueName: \"kubernetes.io/projected/d9c73995-8885-40f2-8491-6216d1ec5c7b-kube-api-access-m2t4r\") pod \"ovn-controller-metrics-lkm6g\" (UID: \"d9c73995-8885-40f2-8491-6216d1ec5c7b\") " pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.737285 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.760532 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-zhfq5"] Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.802939 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-lkm6g" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.857801 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-72jcd"] Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.888510 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtddl\" (UniqueName: \"kubernetes.io/projected/6bc63570-c593-4ef8-bad6-efe017070990-kube-api-access-jtddl\") pod \"dnsmasq-dns-7f896c8c65-zhfq5\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.888589 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-zhfq5\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.888644 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-config\") pod \"dnsmasq-dns-7f896c8c65-zhfq5\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.888761 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-zhfq5\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.904978 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-ztrnz"] Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.907883 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.911279 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.963045 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-ztrnz"] Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.990188 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-zhfq5\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.990266 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtddl\" (UniqueName: \"kubernetes.io/projected/6bc63570-c593-4ef8-bad6-efe017070990-kube-api-access-jtddl\") pod \"dnsmasq-dns-7f896c8c65-zhfq5\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.990322 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-zhfq5\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.990365 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-config\") pod \"dnsmasq-dns-7f896c8c65-zhfq5\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.991339 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-config\") pod \"dnsmasq-dns-7f896c8c65-zhfq5\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.994152 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-zhfq5\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:25 crc kubenswrapper[4937]: I0225 16:10:25.994783 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-zhfq5\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.020623 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtddl\" (UniqueName: \"kubernetes.io/projected/6bc63570-c593-4ef8-bad6-efe017070990-kube-api-access-jtddl\") pod \"dnsmasq-dns-7f896c8c65-zhfq5\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.093706 4937 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-config\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.093788 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.094049 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.094152 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.094375 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48ntq\" (UniqueName: \"kubernetes.io/projected/215bd06c-8d29-4761-ac40-911e2f9fcd73-kube-api-access-48ntq\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.107177 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.196478 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-config\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.196565 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.196641 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.196671 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.196739 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48ntq\" (UniqueName: \"kubernetes.io/projected/215bd06c-8d29-4761-ac40-911e2f9fcd73-kube-api-access-48ntq\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.198202 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-config\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.199439 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.199939 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.201168 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 
16:10:26.219104 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48ntq\" (UniqueName: \"kubernetes.io/projected/215bd06c-8d29-4761-ac40-911e2f9fcd73-kube-api-access-48ntq\") pod \"dnsmasq-dns-86db49b7ff-ztrnz\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.278740 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.280152 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.304248 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" event={"ID":"cc0714eb-ff24-4699-af32-8e894985faa4","Type":"ContainerDied","Data":"4c21b269bc7d9de3e89b502ab7ecea74b47b879c6efc5f8faf072cc38bc85ba8"} Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.304347 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vdlq8" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.401033 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc0714eb-ff24-4699-af32-8e894985faa4-dns-svc\") pod \"cc0714eb-ff24-4699-af32-8e894985faa4\" (UID: \"cc0714eb-ff24-4699-af32-8e894985faa4\") " Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.401081 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0714eb-ff24-4699-af32-8e894985faa4-config\") pod \"cc0714eb-ff24-4699-af32-8e894985faa4\" (UID: \"cc0714eb-ff24-4699-af32-8e894985faa4\") " Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.401125 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phkz6\" (UniqueName: \"kubernetes.io/projected/cc0714eb-ff24-4699-af32-8e894985faa4-kube-api-access-phkz6\") pod \"cc0714eb-ff24-4699-af32-8e894985faa4\" (UID: \"cc0714eb-ff24-4699-af32-8e894985faa4\") " Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.402979 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc0714eb-ff24-4699-af32-8e894985faa4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc0714eb-ff24-4699-af32-8e894985faa4" (UID: "cc0714eb-ff24-4699-af32-8e894985faa4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.403958 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc0714eb-ff24-4699-af32-8e894985faa4-config" (OuterVolumeSpecName: "config") pod "cc0714eb-ff24-4699-af32-8e894985faa4" (UID: "cc0714eb-ff24-4699-af32-8e894985faa4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.406467 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc0714eb-ff24-4699-af32-8e894985faa4-kube-api-access-phkz6" (OuterVolumeSpecName: "kube-api-access-phkz6") pod "cc0714eb-ff24-4699-af32-8e894985faa4" (UID: "cc0714eb-ff24-4699-af32-8e894985faa4"). InnerVolumeSpecName "kube-api-access-phkz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.505810 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc0714eb-ff24-4699-af32-8e894985faa4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.505843 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0714eb-ff24-4699-af32-8e894985faa4-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.505853 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phkz6\" (UniqueName: \"kubernetes.io/projected/cc0714eb-ff24-4699-af32-8e894985faa4-kube-api-access-phkz6\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.523461 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-72jcd" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.563229 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lkm6g"] Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.606740 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f1b597-5189-495f-bd75-69d65853615e-config\") pod \"c4f1b597-5189-495f-bd75-69d65853615e\" (UID: \"c4f1b597-5189-495f-bd75-69d65853615e\") " Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.606875 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4f1b597-5189-495f-bd75-69d65853615e-dns-svc\") pod \"c4f1b597-5189-495f-bd75-69d65853615e\" (UID: \"c4f1b597-5189-495f-bd75-69d65853615e\") " Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.607007 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wpz9\" (UniqueName: \"kubernetes.io/projected/c4f1b597-5189-495f-bd75-69d65853615e-kube-api-access-8wpz9\") pod \"c4f1b597-5189-495f-bd75-69d65853615e\" (UID: \"c4f1b597-5189-495f-bd75-69d65853615e\") " Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.616903 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f1b597-5189-495f-bd75-69d65853615e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4f1b597-5189-495f-bd75-69d65853615e" (UID: "c4f1b597-5189-495f-bd75-69d65853615e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.616918 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f1b597-5189-495f-bd75-69d65853615e-config" (OuterVolumeSpecName: "config") pod "c4f1b597-5189-495f-bd75-69d65853615e" (UID: "c4f1b597-5189-495f-bd75-69d65853615e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.619073 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4f1b597-5189-495f-bd75-69d65853615e-kube-api-access-8wpz9" (OuterVolumeSpecName: "kube-api-access-8wpz9") pod "c4f1b597-5189-495f-bd75-69d65853615e" (UID: "c4f1b597-5189-495f-bd75-69d65853615e"). InnerVolumeSpecName "kube-api-access-8wpz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.681983 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vdlq8"] Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.692212 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vdlq8"] Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.709010 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f1b597-5189-495f-bd75-69d65853615e-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.709045 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4f1b597-5189-495f-bd75-69d65853615e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:26 crc kubenswrapper[4937]: I0225 16:10:26.709055 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wpz9\" (UniqueName: \"kubernetes.io/projected/c4f1b597-5189-495f-bd75-69d65853615e-kube-api-access-8wpz9\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:26 crc kubenswrapper[4937]: W0225 16:10:26.963195 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9c73995_8885_40f2_8491_6216d1ec5c7b.slice/crio-8b5b14f25136c73d4e682f5f84990679a9431e82234548cd34da3d89d517f5a9 WatchSource:0}: Error finding container 8b5b14f25136c73d4e682f5f84990679a9431e82234548cd34da3d89d517f5a9: Status 404 returned error can't find the container with id 8b5b14f25136c73d4e682f5f84990679a9431e82234548cd34da3d89d517f5a9 Feb 25 16:10:27 crc kubenswrapper[4937]: I0225 16:10:27.312359 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lkm6g" event={"ID":"d9c73995-8885-40f2-8491-6216d1ec5c7b","Type":"ContainerStarted","Data":"8b5b14f25136c73d4e682f5f84990679a9431e82234548cd34da3d89d517f5a9"} Feb 25 16:10:27 crc kubenswrapper[4937]: I0225 16:10:27.315836 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-72jcd" event={"ID":"c4f1b597-5189-495f-bd75-69d65853615e","Type":"ContainerDied","Data":"014779d44658eaa3c0675474ac2351b7101b077b742e35f7c9fe152e72461637"} Feb 25 16:10:27 crc kubenswrapper[4937]: I0225 16:10:27.315876 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-72jcd" Feb 25 16:10:27 crc kubenswrapper[4937]: I0225 16:10:27.412605 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc0714eb-ff24-4699-af32-8e894985faa4" path="/var/lib/kubelet/pods/cc0714eb-ff24-4699-af32-8e894985faa4/volumes" Feb 25 16:10:27 crc kubenswrapper[4937]: I0225 16:10:27.413110 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-72jcd"] Feb 25 16:10:27 crc kubenswrapper[4937]: I0225 16:10:27.413138 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-72jcd"] Feb 25 16:10:28 crc kubenswrapper[4937]: I0225 16:10:28.327561 4937 generic.go:334] "Generic (PLEG): container finished" podID="9e2484d7-6d50-43d2-9105-e83280f565ac" containerID="555a0f0fffa5a3625cfa1e73fe964c9108d8ff978d468033239bf0157cd76619" exitCode=0 Feb 25 16:10:28 crc kubenswrapper[4937]: I0225 16:10:28.327642 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9e2484d7-6d50-43d2-9105-e83280f565ac","Type":"ContainerDied","Data":"555a0f0fffa5a3625cfa1e73fe964c9108d8ff978d468033239bf0157cd76619"} Feb 25 16:10:28 crc kubenswrapper[4937]: I0225 16:10:28.459727 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-zhfq5"] Feb 25 16:10:29 crc kubenswrapper[4937]: I0225 16:10:29.381530 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4f1b597-5189-495f-bd75-69d65853615e" path="/var/lib/kubelet/pods/c4f1b597-5189-495f-bd75-69d65853615e/volumes" Feb 25 16:10:30 crc kubenswrapper[4937]: I0225 16:10:30.196562 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-ztrnz"] Feb 25 16:10:30 crc kubenswrapper[4937]: W0225 16:10:30.566645 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bc63570_c593_4ef8_bad6_efe017070990.slice/crio-4b710f379202c9d733efb9beddebfbea36b52604032f135614a5b6696ed8136c WatchSource:0}: Error finding container 4b710f379202c9d733efb9beddebfbea36b52604032f135614a5b6696ed8136c: Status 404 returned error can't find the container with id 4b710f379202c9d733efb9beddebfbea36b52604032f135614a5b6696ed8136c Feb 25 16:10:30 crc kubenswrapper[4937]: W0225 16:10:30.640203 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod215bd06c_8d29_4761_ac40_911e2f9fcd73.slice/crio-f3e37f6b53165fa9bcce829fedef12ea4cfc2e4ec669e5b51f3180e2946dbe3c WatchSource:0}: Error finding container f3e37f6b53165fa9bcce829fedef12ea4cfc2e4ec669e5b51f3180e2946dbe3c: Status 404 returned error can't find the container with id f3e37f6b53165fa9bcce829fedef12ea4cfc2e4ec669e5b51f3180e2946dbe3c Feb 25 16:10:31 crc kubenswrapper[4937]: I0225 16:10:31.156642 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 25 16:10:31 crc kubenswrapper[4937]: I0225 16:10:31.399053 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" event={"ID":"6bc63570-c593-4ef8-bad6-efe017070990","Type":"ContainerStarted","Data":"4b710f379202c9d733efb9beddebfbea36b52604032f135614a5b6696ed8136c"} Feb 25 16:10:31 crc kubenswrapper[4937]: I0225 16:10:31.399091 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" 
event={"ID":"215bd06c-8d29-4761-ac40-911e2f9fcd73","Type":"ContainerStarted","Data":"f3e37f6b53165fa9bcce829fedef12ea4cfc2e4ec669e5b51f3180e2946dbe3c"} Feb 25 16:10:31 crc kubenswrapper[4937]: I0225 16:10:31.399103 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533930-xjhtn" event={"ID":"005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8","Type":"ContainerStarted","Data":"3c92e8885d8aa5b20c72edf65a0d619e10ed84ce6ea98e7f185ac4ccdd27e091"} Feb 25 16:10:31 crc kubenswrapper[4937]: I0225 16:10:31.719749 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533930-xjhtn" podStartSLOduration=28.73908736 podStartE2EDuration="31.719722852s" podCreationTimestamp="2026-02-25 16:10:00 +0000 UTC" firstStartedPulling="2026-02-25 16:10:23.46144471 +0000 UTC m=+1474.474836610" lastFinishedPulling="2026-02-25 16:10:26.442080212 +0000 UTC m=+1477.455472102" observedRunningTime="2026-02-25 16:10:31.7180544 +0000 UTC m=+1482.731446290" watchObservedRunningTime="2026-02-25 16:10:31.719722852 +0000 UTC m=+1482.733114752" Feb 25 16:10:32 crc kubenswrapper[4937]: I0225 16:10:32.411616 4937 generic.go:334] "Generic (PLEG): container finished" podID="005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8" containerID="3c92e8885d8aa5b20c72edf65a0d619e10ed84ce6ea98e7f185ac4ccdd27e091" exitCode=0 Feb 25 16:10:32 crc kubenswrapper[4937]: I0225 16:10:32.411648 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533930-xjhtn" event={"ID":"005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8","Type":"ContainerDied","Data":"3c92e8885d8aa5b20c72edf65a0d619e10ed84ce6ea98e7f185ac4ccdd27e091"} Feb 25 16:10:34 crc kubenswrapper[4937]: I0225 16:10:34.430032 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9e2484d7-6d50-43d2-9105-e83280f565ac","Type":"ContainerStarted","Data":"52ea0dbc570d64b2967446503d8e1d4d08062b883f19213ab630e23bb32a82ef"} Feb 25 16:10:34 crc kubenswrapper[4937]: I0225 16:10:34.433563 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" event={"ID":"e13a5d7c-5a1a-466b-83a0-d76859e2cd3e","Type":"ContainerStarted","Data":"8961549ba76a9136a6b294bd9d33da1c41d67e0f91b32881560185ca6a8ef2f2"} Feb 25 16:10:34 crc kubenswrapper[4937]: I0225 16:10:34.433792 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:34 crc kubenswrapper[4937]: I0225 16:10:34.436278 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533930-xjhtn" event={"ID":"005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8","Type":"ContainerDied","Data":"b2b3d8d4ba79239cfbc6e601017be3de5f1c145b4e25378254929e9dcd1c0bfb"} Feb 25 16:10:34 crc kubenswrapper[4937]: I0225 16:10:34.436312 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2b3d8d4ba79239cfbc6e601017be3de5f1c145b4e25378254929e9dcd1c0bfb" Feb 25 16:10:34 crc kubenswrapper[4937]: I0225 16:10:34.444245 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" Feb 25 16:10:34 crc kubenswrapper[4937]: I0225 16:10:34.461441 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=14.896787074 podStartE2EDuration="40.461415115s" podCreationTimestamp="2026-02-25 16:09:54 
+0000 UTC" firstStartedPulling="2026-02-25 16:09:56.569781496 +0000 UTC m=+1447.583173386" lastFinishedPulling="2026-02-25 16:10:22.134409527 +0000 UTC m=+1473.147801427" observedRunningTime="2026-02-25 16:10:34.456980184 +0000 UTC m=+1485.470372304" watchObservedRunningTime="2026-02-25 16:10:34.461415115 +0000 UTC m=+1485.474807025" Feb 25 16:10:34 crc kubenswrapper[4937]: I0225 16:10:34.482944 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-t9wss" podStartSLOduration=17.558484541 podStartE2EDuration="25.482921124s" podCreationTimestamp="2026-02-25 16:10:09 +0000 UTC" firstStartedPulling="2026-02-25 16:10:23.706311418 +0000 UTC m=+1474.719703308" lastFinishedPulling="2026-02-25 16:10:31.630747981 +0000 UTC m=+1482.644139891" observedRunningTime="2026-02-25 16:10:34.478295858 +0000 UTC m=+1485.491687768" watchObservedRunningTime="2026-02-25 16:10:34.482921124 +0000 UTC m=+1485.496313014" Feb 25 16:10:35 crc kubenswrapper[4937]: I0225 16:10:35.452290 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" event={"ID":"48e5f2c6-d4ed-48a1-8737-693b54c43613","Type":"ContainerStarted","Data":"8ae8abdbffa4b61a6d5e5d8205d5975ac38b0f02fe4695e7d4f3844e8079d5a7"} Feb 25 16:10:35 crc kubenswrapper[4937]: I0225 16:10:35.452810 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:35 crc kubenswrapper[4937]: I0225 16:10:35.489645 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" podStartSLOduration=17.940149687999998 podStartE2EDuration="26.489626808s" podCreationTimestamp="2026-02-25 16:10:09 +0000 UTC" firstStartedPulling="2026-02-25 16:10:23.651205027 +0000 UTC m=+1474.664596917" lastFinishedPulling="2026-02-25 16:10:32.200682147 +0000 UTC m=+1483.214074037" observedRunningTime="2026-02-25 16:10:35.485985287 +0000 UTC m=+1486.499377177" watchObservedRunningTime="2026-02-25 16:10:35.489626808 +0000 UTC m=+1486.503018698" Feb 25 16:10:35 crc kubenswrapper[4937]: I0225 16:10:35.752974 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533930-xjhtn" Feb 25 16:10:35 crc kubenswrapper[4937]: I0225 16:10:35.799648 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbswk\" (UniqueName: \"kubernetes.io/projected/005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8-kube-api-access-wbswk\") pod \"005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8\" (UID: \"005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8\") " Feb 25 16:10:35 crc kubenswrapper[4937]: I0225 16:10:35.922919 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8-kube-api-access-wbswk" (OuterVolumeSpecName: "kube-api-access-wbswk") pod "005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8" (UID: "005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8"). InnerVolumeSpecName "kube-api-access-wbswk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.002429 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbswk\" (UniqueName: \"kubernetes.io/projected/005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8-kube-api-access-wbswk\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.049174 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.049219 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.463212 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"08382e6d-e8e5-4656-a524-26c8269114fd","Type":"ContainerStarted","Data":"87ce7c40ce3fb50261623a7254d3bcc087d65abb04483d29dba875af551574c0"} Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.463404 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.465955 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"c4c8966b-44e5-42fd-ae20-d3099876ee36","Type":"ContainerStarted","Data":"1a300e7fcc8c5b8313f466e632471bec9b2851dd621d4167a28726278c63d7b8"} Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.466295 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.468658 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" event={"ID":"5d351b94-5168-4f7f-9d70-c2cd2225dba8","Type":"ContainerStarted","Data":"a6151df173b113a3eb765e764ccdcacb3934957718cac2b34acc633eff8397cf"} Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.469512 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.474731 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533930-xjhtn" Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.474745 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0a044be7-a58d-4684-8252-5a850694fb04","Type":"ContainerStarted","Data":"1ecf35f98f40bccc66b8fd79aca1d7e4fca9c471f77fe5c2a49a96eefba0e90e"} Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.493803 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=18.582623012 podStartE2EDuration="27.493784589s" podCreationTimestamp="2026-02-25 16:10:09 +0000 UTC" firstStartedPulling="2026-02-25 16:10:23.717758835 +0000 UTC m=+1474.731150725" lastFinishedPulling="2026-02-25 16:10:32.628920412 +0000 UTC m=+1483.642312302" observedRunningTime="2026-02-25 16:10:36.489572323 +0000 UTC m=+1487.502964213" watchObservedRunningTime="2026-02-25 16:10:36.493784589 +0000 UTC m=+1487.507176479" Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.542623 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=18.464753537 podStartE2EDuration="27.542604632s" podCreationTimestamp="2026-02-25 16:10:09 +0000 UTC" firstStartedPulling="2026-02-25 16:10:23.55238764 +0000 UTC m=+1474.565779530" lastFinishedPulling="2026-02-25 16:10:32.630238735 +0000 UTC m=+1483.643630625" observedRunningTime="2026-02-25 16:10:36.52056951 +0000 UTC m=+1487.533961400" watchObservedRunningTime="2026-02-25 16:10:36.542604632 +0000 UTC m=+1487.555996522" Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.544122 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" podStartSLOduration=18.463078934 podStartE2EDuration="27.54411538s" podCreationTimestamp="2026-02-25 16:10:09 +0000 UTC" firstStartedPulling="2026-02-25 16:10:23.548146723 +0000 UTC m=+1474.561538613" lastFinishedPulling="2026-02-25 16:10:32.629183169 +0000 UTC m=+1483.642575059" observedRunningTime="2026-02-25 16:10:36.536427097 +0000 UTC m=+1487.549818987" watchObservedRunningTime="2026-02-25 16:10:36.54411538 +0000 UTC m=+1487.557507270" Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.822142 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533924-whqrx"] Feb 25 16:10:36 crc kubenswrapper[4937]: I0225 16:10:36.825887 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533924-whqrx"] Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.386380 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8ea7960-4601-40d9-b43a-69a2799d10c8" path="/var/lib/kubelet/pods/f8ea7960-4601-40d9-b43a-69a2799d10c8/volumes" Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.486984 4937 generic.go:334] "Generic (PLEG): container finished" podID="215bd06c-8d29-4761-ac40-911e2f9fcd73" containerID="ea7075aff1e1f8be4ecf2beeb84ea13807662c8c37a50c90aaf98862f00803ef" exitCode=0 Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.487059 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" event={"ID":"215bd06c-8d29-4761-ac40-911e2f9fcd73","Type":"ContainerDied","Data":"ea7075aff1e1f8be4ecf2beeb84ea13807662c8c37a50c90aaf98862f00803ef"} Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.488646 4937 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d34734ba-2195-4ea2-aa76-654c3c85a206","Type":"ContainerStarted","Data":"6ac683d3e2d86111c3d04b83d23a991feccf0736c9fcc99f088f3ac5b5fabf4a"} Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.488781 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.490723 4937 generic.go:334] "Generic (PLEG): container finished" podID="388f0d04-d580-46ae-a729-667d81ad11a0" containerID="e9804fc3a2eed6a30597b41ca6024c0e00bf7a491ad41015c4c959c222619bbd" exitCode=0 Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.490859 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4rsxl" event={"ID":"388f0d04-d580-46ae-a729-667d81ad11a0","Type":"ContainerDied","Data":"e9804fc3a2eed6a30597b41ca6024c0e00bf7a491ad41015c4c959c222619bbd"} Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.495909 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08","Type":"ContainerStarted","Data":"ac923dc80f719124468c7e138fb127adb6401edd28d7dbbfbcc173996bc5404f"} Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.500725 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0a044be7-a58d-4684-8252-5a850694fb04","Type":"ContainerStarted","Data":"8a666e5d3e620cd41f36e3171a559b2ec57e1f99ef1f6a5d7244fc7352c6f054"} Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.506068 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lkm6g" event={"ID":"d9c73995-8885-40f2-8491-6216d1ec5c7b","Type":"ContainerStarted","Data":"de284db977a0479635354f5716744c841d17519d8dfeb526f61c84ec75d484ee"} Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.510736 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d773f4d2-bec3-4379-a7a2-29975a18c85b","Type":"ContainerStarted","Data":"4cd0ad3573b75b13bcc09c4585a1fd172977199904c71b28892a8264d25edd65"} Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.513680 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9f90fdcc-629f-46e9-9485-de80d43ea155","Type":"ContainerStarted","Data":"e32308d0642d3c8905970599b30df4838643779f555870be4d3c24d9df44e71d"} Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.513825 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9f90fdcc-629f-46e9-9485-de80d43ea155" containerName="init-config-reloader" containerID="cri-o://e32308d0642d3c8905970599b30df4838643779f555870be4d3c24d9df44e71d" gracePeriod=600 Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.523641 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" event={"ID":"836ae71a-cf0f-4a00-a0bc-78d1be68f830","Type":"ContainerStarted","Data":"79479beb9e52d5988d13b9e619b7ea909cf83e86b32a2dd4726093311a9f63fd"} Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.524263 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.526705 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sgdhb" 
event={"ID":"c0b0baed-3140-4ac4-9d27-e8fc15c390c2","Type":"ContainerStarted","Data":"6a45dd79b6bd5f99581139683c8d373f0461ff97c2b1cd097a98b3970a9ce6ae"} Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.526967 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-sgdhb" Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.534764 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe","Type":"ContainerStarted","Data":"986aef7e61edd7d3ded7eb3186350b7282ecbe6821e865a04eef05e2444099ac"} Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.539272 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-lkm6g" podStartSLOduration=6.681823271 podStartE2EDuration="12.539258274s" podCreationTimestamp="2026-02-25 16:10:25 +0000 UTC" firstStartedPulling="2026-02-25 16:10:26.970652501 +0000 UTC m=+1477.984044381" lastFinishedPulling="2026-02-25 16:10:32.828087494 +0000 UTC m=+1483.841479384" observedRunningTime="2026-02-25 16:10:37.537012968 +0000 UTC m=+1488.550404898" watchObservedRunningTime="2026-02-25 16:10:37.539258274 +0000 UTC m=+1488.552650164" Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.540596 4937 generic.go:334] "Generic (PLEG): container finished" podID="6bc63570-c593-4ef8-bad6-efe017070990" containerID="f20aba585442d21cf519c7a6ad3d20f9acb0729805331d6d2530b58b44056b38" exitCode=0 Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.543347 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" event={"ID":"6bc63570-c593-4ef8-bad6-efe017070990","Type":"ContainerDied","Data":"f20aba585442d21cf519c7a6ad3d20f9acb0729805331d6d2530b58b44056b38"} Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.571834 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.581311 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=24.263734501 podStartE2EDuration="32.581291538s" podCreationTimestamp="2026-02-25 16:10:05 +0000 UTC" firstStartedPulling="2026-02-25 16:10:24.127703891 +0000 UTC m=+1475.141095781" lastFinishedPulling="2026-02-25 16:10:32.445260928 +0000 UTC m=+1483.458652818" observedRunningTime="2026-02-25 16:10:37.578800275 +0000 UTC m=+1488.592192175" watchObservedRunningTime="2026-02-25 16:10:37.581291538 +0000 UTC m=+1488.594683428" Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.613439 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=30.510930475 podStartE2EDuration="40.613418513s" podCreationTimestamp="2026-02-25 16:09:57 +0000 UTC" firstStartedPulling="2026-02-25 16:10:23.566527224 +0000 UTC m=+1474.579919114" lastFinishedPulling="2026-02-25 16:10:33.669015262 +0000 UTC m=+1484.682407152" observedRunningTime="2026-02-25 16:10:37.597269008 +0000 UTC m=+1488.610660898" watchObservedRunningTime="2026-02-25 16:10:37.613418513 +0000 UTC m=+1488.626810403" Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.755054 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-v7lt7" podStartSLOduration=19.847726372 podStartE2EDuration="28.755031533s" 
podCreationTimestamp="2026-02-25 16:10:09 +0000 UTC" firstStartedPulling="2026-02-25 16:10:23.721633692 +0000 UTC m=+1474.735025582" lastFinishedPulling="2026-02-25 16:10:32.628938853 +0000 UTC m=+1483.642330743" observedRunningTime="2026-02-25 16:10:37.749867343 +0000 UTC m=+1488.763259233" watchObservedRunningTime="2026-02-25 16:10:37.755031533 +0000 UTC m=+1488.768423423" Feb 25 16:10:37 crc kubenswrapper[4937]: I0225 16:10:37.797803 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sgdhb" podStartSLOduration=26.868848512 podStartE2EDuration="35.797787154s" podCreationTimestamp="2026-02-25 16:10:02 +0000 UTC" firstStartedPulling="2026-02-25 16:10:23.516281685 +0000 UTC m=+1474.529673575" lastFinishedPulling="2026-02-25 16:10:32.445220327 +0000 UTC m=+1483.458612217" observedRunningTime="2026-02-25 16:10:37.785698341 +0000 UTC m=+1488.799090231" watchObservedRunningTime="2026-02-25 16:10:37.797787154 +0000 UTC m=+1488.811179044" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.143127 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-zhfq5"] Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.169042 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-6xfpd"] Feb 25 16:10:38 crc kubenswrapper[4937]: E0225 16:10:38.169472 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8" containerName="oc" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.169514 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8" containerName="oc" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.169734 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8" containerName="oc" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.182092 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.197461 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-6xfpd"] Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.247112 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-config\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.247405 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.247583 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.247701 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrncf\" (UniqueName: \"kubernetes.io/projected/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-kube-api-access-vrncf\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.248140 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-dns-svc\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.349372 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.349417 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.349449 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrncf\" (UniqueName: \"kubernetes.io/projected/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-kube-api-access-vrncf\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.349512 4937 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-dns-svc\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.349576 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-config\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.350286 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.350460 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.354108 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-config\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.354601 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-dns-svc\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.392599 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrncf\" (UniqueName: \"kubernetes.io/projected/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-kube-api-access-vrncf\") pod \"dnsmasq-dns-698758b865-6xfpd\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.548349 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.550749 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"de5b4144-33d4-4860-9872-8826c78490a7","Type":"ContainerStarted","Data":"82dee4b670df39dc191f5c519f9747dc8a2893b9682ea4e53c75a02199de7f0c"} Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.552646 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" event={"ID":"f72068e0-28e8-4c10-abeb-c067fe29c2f4","Type":"ContainerStarted","Data":"0c61c983298b0343157f9275b8506ad74f65c966937cea76df7a8706e51b0906"} Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.552894 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.554386 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08","Type":"ContainerStarted","Data":"3497fe912dae94ad41d9b5526fe1e210ef00cb5001646675a91ef88c7b176b2b"} Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.556627 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" event={"ID":"6bc63570-c593-4ef8-bad6-efe017070990","Type":"ContainerStarted","Data":"cf9c4bf3250e41c312a1a74f96a131616d423a94d0192648ec34c66f6f3836cc"} Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.556726 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.556653 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" podUID="6bc63570-c593-4ef8-bad6-efe017070990" containerName="dnsmasq-dns" containerID="cri-o://cf9c4bf3250e41c312a1a74f96a131616d423a94d0192648ec34c66f6f3836cc" gracePeriod=10 Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.568732 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" event={"ID":"215bd06c-8d29-4761-ac40-911e2f9fcd73","Type":"ContainerStarted","Data":"98a65cc81711341e7ef85196cc49fec21f8c06d1c9955b78d7a5485722104a36"} Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.569060 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.572983 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9ebad40-444e-4250-85cb-2a154282cdf9","Type":"ContainerStarted","Data":"7d15cf71941dacdd51d4d3f984cb980362aba44baf3e3b14e00f057c6dd681fc"} Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.579525 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4rsxl" event={"ID":"388f0d04-d580-46ae-a729-667d81ad11a0","Type":"ContainerStarted","Data":"b9b7641d25253beba0fbf93c7d77bdbe7ce3982746fc9fa85dcd0079a321dea0"} Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.621907 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" podStartSLOduration=-9223372007.23289 podStartE2EDuration="29.621886241s" podCreationTimestamp="2026-02-25 16:10:09 +0000 UTC" firstStartedPulling="2026-02-25 16:10:23.725213282 +0000 UTC 
m=+1474.738605172" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:10:38.613584163 +0000 UTC m=+1489.626976053" watchObservedRunningTime="2026-02-25 16:10:38.621886241 +0000 UTC m=+1489.635278131" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.660528 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" podStartSLOduration=11.45721488 podStartE2EDuration="13.660508129s" podCreationTimestamp="2026-02-25 16:10:25 +0000 UTC" firstStartedPulling="2026-02-25 16:10:30.610640071 +0000 UTC m=+1481.624031961" lastFinishedPulling="2026-02-25 16:10:32.81393332 +0000 UTC m=+1483.827325210" observedRunningTime="2026-02-25 16:10:38.636426866 +0000 UTC m=+1489.649818766" watchObservedRunningTime="2026-02-25 16:10:38.660508129 +0000 UTC m=+1489.673900019" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.664742 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" podStartSLOduration=11.481788617 podStartE2EDuration="13.664711165s" podCreationTimestamp="2026-02-25 16:10:25 +0000 UTC" firstStartedPulling="2026-02-25 16:10:30.646234743 +0000 UTC m=+1481.659626633" lastFinishedPulling="2026-02-25 16:10:32.829157291 +0000 UTC m=+1483.842549181" observedRunningTime="2026-02-25 16:10:38.658401807 +0000 UTC m=+1489.671793697" watchObservedRunningTime="2026-02-25 16:10:38.664711165 +0000 UTC m=+1489.678103055" Feb 25 16:10:38 crc kubenswrapper[4937]: I0225 16:10:38.724052 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=28.007001398 podStartE2EDuration="35.724026282s" podCreationTimestamp="2026-02-25 16:10:03 +0000 UTC" firstStartedPulling="2026-02-25 16:10:24.7288701 +0000 UTC m=+1475.742261990" lastFinishedPulling="2026-02-25 16:10:32.445894984 +0000 UTC m=+1483.459286874" observedRunningTime="2026-02-25 16:10:38.717707933 +0000 UTC m=+1489.731099823" watchObservedRunningTime="2026-02-25 16:10:38.724026282 +0000 UTC m=+1489.737418172" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.050305 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-6xfpd"] Feb 25 16:10:39 crc kubenswrapper[4937]: W0225 16:10:39.056394 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f0c1cf_9d3a_4974_9959_2a6327b9dac7.slice/crio-5fe907070a15ab85aceaea53620ddb63128a2a1a7b6b8625ef7e4170f81f61ba WatchSource:0}: Error finding container 5fe907070a15ab85aceaea53620ddb63128a2a1a7b6b8625ef7e4170f81f61ba: Status 404 returned error can't find the container with id 5fe907070a15ab85aceaea53620ddb63128a2a1a7b6b8625ef7e4170f81f61ba Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.238420 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.257001 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 25 16:10:39 crc kubenswrapper[4937]: E0225 16:10:39.257513 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc63570-c593-4ef8-bad6-efe017070990" containerName="init" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.257535 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc63570-c593-4ef8-bad6-efe017070990" containerName="init" Feb 25 16:10:39 crc kubenswrapper[4937]: E0225 16:10:39.257573 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc63570-c593-4ef8-bad6-efe017070990" containerName="dnsmasq-dns" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.257582 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc63570-c593-4ef8-bad6-efe017070990" containerName="dnsmasq-dns" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.257812 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc63570-c593-4ef8-bad6-efe017070990" containerName="dnsmasq-dns" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.263955 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.268821 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.268875 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-dthnz" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.268938 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.269798 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.288870 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.366925 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-dns-svc\") pod \"6bc63570-c593-4ef8-bad6-efe017070990\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.366987 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-ovsdbserver-sb\") pod \"6bc63570-c593-4ef8-bad6-efe017070990\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.367040 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-config\") pod \"6bc63570-c593-4ef8-bad6-efe017070990\" (UID: \"6bc63570-c593-4ef8-bad6-efe017070990\") " Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.367125 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtddl\" (UniqueName: \"kubernetes.io/projected/6bc63570-c593-4ef8-bad6-efe017070990-kube-api-access-jtddl\") pod \"6bc63570-c593-4ef8-bad6-efe017070990\" (UID: 
\"6bc63570-c593-4ef8-bad6-efe017070990\") " Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.367458 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76wnb\" (UniqueName: \"kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-kube-api-access-76wnb\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.367572 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/48d22af0-5579-46fb-889d-fd34e46d26e9-lock\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.367647 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/48d22af0-5579-46fb-889d-fd34e46d26e9-cache\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.367708 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-16e10e0c-6ce2-46e1-b9e8-7329f06fead3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16e10e0c-6ce2-46e1-b9e8-7329f06fead3\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.367735 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.367754 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48d22af0-5579-46fb-889d-fd34e46d26e9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.383752 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc63570-c593-4ef8-bad6-efe017070990-kube-api-access-jtddl" (OuterVolumeSpecName: "kube-api-access-jtddl") pod "6bc63570-c593-4ef8-bad6-efe017070990" (UID: "6bc63570-c593-4ef8-bad6-efe017070990"). InnerVolumeSpecName "kube-api-access-jtddl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.426183 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6bc63570-c593-4ef8-bad6-efe017070990" (UID: "6bc63570-c593-4ef8-bad6-efe017070990"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.429899 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-config" (OuterVolumeSpecName: "config") pod "6bc63570-c593-4ef8-bad6-efe017070990" (UID: "6bc63570-c593-4ef8-bad6-efe017070990"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.440798 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6bc63570-c593-4ef8-bad6-efe017070990" (UID: "6bc63570-c593-4ef8-bad6-efe017070990"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.451528 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.469590 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76wnb\" (UniqueName: \"kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-kube-api-access-76wnb\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.469685 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/48d22af0-5579-46fb-889d-fd34e46d26e9-lock\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.469814 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/48d22af0-5579-46fb-889d-fd34e46d26e9-cache\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.469854 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48d22af0-5579-46fb-889d-fd34e46d26e9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.469878 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-16e10e0c-6ce2-46e1-b9e8-7329f06fead3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16e10e0c-6ce2-46e1-b9e8-7329f06fead3\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.469901 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.469979 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:39 crc kubenswrapper[4937]: 
I0225 16:10:39.469989 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.469998 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bc63570-c593-4ef8-bad6-efe017070990-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.470007 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtddl\" (UniqueName: \"kubernetes.io/projected/6bc63570-c593-4ef8-bad6-efe017070990-kube-api-access-jtddl\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:39 crc kubenswrapper[4937]: E0225 16:10:39.470732 4937 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 25 16:10:39 crc kubenswrapper[4937]: E0225 16:10:39.470748 4937 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 25 16:10:39 crc kubenswrapper[4937]: E0225 16:10:39.470779 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift podName:48d22af0-5579-46fb-889d-fd34e46d26e9 nodeName:}" failed. No retries permitted until 2026-02-25 16:10:39.97076777 +0000 UTC m=+1490.984159660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift") pod "swift-storage-0" (UID: "48d22af0-5579-46fb-889d-fd34e46d26e9") : configmap "swift-ring-files" not found Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.472767 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/48d22af0-5579-46fb-889d-fd34e46d26e9-cache\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.473995 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.474023 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-16e10e0c-6ce2-46e1-b9e8-7329f06fead3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16e10e0c-6ce2-46e1-b9e8-7329f06fead3\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/94fc0d693ae20591fc3735d2e34f8eba69ec58d58a87a9ee7f8aaea3e303fb8d/globalmount\"" pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.474578 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/48d22af0-5579-46fb-889d-fd34e46d26e9-lock\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.477537 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48d22af0-5579-46fb-889d-fd34e46d26e9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.487988 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76wnb\" (UniqueName: \"kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-kube-api-access-76wnb\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.497544 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.511679 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-16e10e0c-6ce2-46e1-b9e8-7329f06fead3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16e10e0c-6ce2-46e1-b9e8-7329f06fead3\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.591756 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4rsxl" event={"ID":"388f0d04-d580-46ae-a729-667d81ad11a0","Type":"ContainerStarted","Data":"4a875634d866f593e4c28b8f2364cb4983b6f49a5b7019566482a2e2dcb0ffac"} Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.592899 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.595463 4937 generic.go:334] "Generic (PLEG): container finished" podID="6bc63570-c593-4ef8-bad6-efe017070990" containerID="cf9c4bf3250e41c312a1a74f96a131616d423a94d0192648ec34c66f6f3836cc" exitCode=0 Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.595547 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.595531 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" event={"ID":"6bc63570-c593-4ef8-bad6-efe017070990","Type":"ContainerDied","Data":"cf9c4bf3250e41c312a1a74f96a131616d423a94d0192648ec34c66f6f3836cc"} Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.595669 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-zhfq5" event={"ID":"6bc63570-c593-4ef8-bad6-efe017070990","Type":"ContainerDied","Data":"4b710f379202c9d733efb9beddebfbea36b52604032f135614a5b6696ed8136c"} Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.595694 4937 scope.go:117] "RemoveContainer" containerID="cf9c4bf3250e41c312a1a74f96a131616d423a94d0192648ec34c66f6f3836cc" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.604904 4937 generic.go:334] "Generic (PLEG): container finished" podID="b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" containerID="a240d8936cb0e8c601adb670e19ca512ef04be6aefd27f3fc71d9c22b2aa83ea" exitCode=0 Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.605066 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-6xfpd" event={"ID":"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7","Type":"ContainerDied","Data":"a240d8936cb0e8c601adb670e19ca512ef04be6aefd27f3fc71d9c22b2aa83ea"} Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.605195 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-6xfpd" event={"ID":"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7","Type":"ContainerStarted","Data":"5fe907070a15ab85aceaea53620ddb63128a2a1a7b6b8625ef7e4170f81f61ba"} Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.605778 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.625778 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-4rsxl" podStartSLOduration=30.275101324 podStartE2EDuration="37.625754225s" podCreationTimestamp="2026-02-25 16:10:02 +0000 UTC" firstStartedPulling="2026-02-25 16:10:24.788047583 +0000 UTC m=+1475.801439473" lastFinishedPulling="2026-02-25 16:10:32.138700484 +0000 UTC m=+1483.152092374" observedRunningTime="2026-02-25 16:10:39.613858376 +0000 UTC m=+1490.627250276" watchObservedRunningTime="2026-02-25 16:10:39.625754225 +0000 UTC m=+1490.639146125" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.640230 4937 scope.go:117] "RemoveContainer" containerID="f20aba585442d21cf519c7a6ad3d20f9acb0729805331d6d2530b58b44056b38" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.673456 4937 scope.go:117] "RemoveContainer" containerID="cf9c4bf3250e41c312a1a74f96a131616d423a94d0192648ec34c66f6f3836cc" Feb 25 16:10:39 crc kubenswrapper[4937]: E0225 16:10:39.687901 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9c4bf3250e41c312a1a74f96a131616d423a94d0192648ec34c66f6f3836cc\": container with ID starting with cf9c4bf3250e41c312a1a74f96a131616d423a94d0192648ec34c66f6f3836cc not found: ID does not exist" containerID="cf9c4bf3250e41c312a1a74f96a131616d423a94d0192648ec34c66f6f3836cc" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.687956 4937 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cf9c4bf3250e41c312a1a74f96a131616d423a94d0192648ec34c66f6f3836cc"} err="failed to get container status \"cf9c4bf3250e41c312a1a74f96a131616d423a94d0192648ec34c66f6f3836cc\": rpc error: code = NotFound desc = could not find container \"cf9c4bf3250e41c312a1a74f96a131616d423a94d0192648ec34c66f6f3836cc\": container with ID starting with cf9c4bf3250e41c312a1a74f96a131616d423a94d0192648ec34c66f6f3836cc not found: ID does not exist" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.687994 4937 scope.go:117] "RemoveContainer" containerID="f20aba585442d21cf519c7a6ad3d20f9acb0729805331d6d2530b58b44056b38" Feb 25 16:10:39 crc kubenswrapper[4937]: E0225 16:10:39.688470 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20aba585442d21cf519c7a6ad3d20f9acb0729805331d6d2530b58b44056b38\": container with ID starting with f20aba585442d21cf519c7a6ad3d20f9acb0729805331d6d2530b58b44056b38 not found: ID does not exist" containerID="f20aba585442d21cf519c7a6ad3d20f9acb0729805331d6d2530b58b44056b38" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.688598 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20aba585442d21cf519c7a6ad3d20f9acb0729805331d6d2530b58b44056b38"} err="failed to get container status \"f20aba585442d21cf519c7a6ad3d20f9acb0729805331d6d2530b58b44056b38\": rpc error: code = NotFound desc = could not find container \"f20aba585442d21cf519c7a6ad3d20f9acb0729805331d6d2530b58b44056b38\": container with ID starting with f20aba585442d21cf519c7a6ad3d20f9acb0729805331d6d2530b58b44056b38 not found: ID does not exist" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.693033 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-zhfq5"] Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.701851 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-zhfq5"] Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.839156 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-6nkbm"] Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.840501 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.852775 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.852818 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.852909 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.858358 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6nkbm"] Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.878237 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-swiftconf\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.878547 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vn2l\" (UniqueName: \"kubernetes.io/projected/0a0f0530-95e1-4231-9933-bedb49b72a88-kube-api-access-2vn2l\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.878724 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-dispersionconf\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.878920 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a0f0530-95e1-4231-9933-bedb49b72a88-ring-data-devices\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.879069 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a0f0530-95e1-4231-9933-bedb49b72a88-etc-swift\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.879169 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a0f0530-95e1-4231-9933-bedb49b72a88-scripts\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.879270 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-combined-ca-bundle\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 
16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.982675 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a0f0530-95e1-4231-9933-bedb49b72a88-etc-swift\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.983066 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a0f0530-95e1-4231-9933-bedb49b72a88-scripts\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.983093 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-combined-ca-bundle\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.983147 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-swiftconf\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.983181 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vn2l\" (UniqueName: \"kubernetes.io/projected/0a0f0530-95e1-4231-9933-bedb49b72a88-kube-api-access-2vn2l\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.983247 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.983290 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-dispersionconf\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.983409 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a0f0530-95e1-4231-9933-bedb49b72a88-ring-data-devices\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.984327 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a0f0530-95e1-4231-9933-bedb49b72a88-ring-data-devices\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: E0225 16:10:39.984809 4937 projected.go:288] Couldn't get configMap openstack/swift-ring-files: 
configmap "swift-ring-files" not found Feb 25 16:10:39 crc kubenswrapper[4937]: E0225 16:10:39.984835 4937 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 25 16:10:39 crc kubenswrapper[4937]: E0225 16:10:39.984879 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift podName:48d22af0-5579-46fb-889d-fd34e46d26e9 nodeName:}" failed. No retries permitted until 2026-02-25 16:10:40.984863466 +0000 UTC m=+1491.998255366 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift") pod "swift-storage-0" (UID: "48d22af0-5579-46fb-889d-fd34e46d26e9") : configmap "swift-ring-files" not found Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.985595 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a0f0530-95e1-4231-9933-bedb49b72a88-scripts\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.985895 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a0f0530-95e1-4231-9933-bedb49b72a88-etc-swift\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.995782 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-dispersionconf\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:39 crc kubenswrapper[4937]: I0225 16:10:39.997173 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-swiftconf\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:40 crc kubenswrapper[4937]: I0225 16:10:40.001034 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-combined-ca-bundle\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:40 crc kubenswrapper[4937]: I0225 16:10:40.010998 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vn2l\" (UniqueName: \"kubernetes.io/projected/0a0f0530-95e1-4231-9933-bedb49b72a88-kube-api-access-2vn2l\") pod \"swift-ring-rebalance-6nkbm\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:40 crc kubenswrapper[4937]: I0225 16:10:40.271334 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:10:40 crc kubenswrapper[4937]: I0225 16:10:40.403905 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:40 crc kubenswrapper[4937]: I0225 16:10:40.616779 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"123b3439-f7ab-44b4-bbed-02539668cf80","Type":"ContainerStarted","Data":"4dbca555f9134b7a4389d42a0b055f47f69662d14f35128b73fa63a342611e92"} Feb 25 16:10:40 crc kubenswrapper[4937]: I0225 16:10:40.617141 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:10:40 crc kubenswrapper[4937]: I0225 16:10:40.619442 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-6xfpd" event={"ID":"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7","Type":"ContainerStarted","Data":"dca5c99c0ff4635c5eebdf665c288bff87fa4b0ee4dbd978ff021493176fd38a"} Feb 25 16:10:40 crc kubenswrapper[4937]: I0225 16:10:40.620724 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:10:40 crc kubenswrapper[4937]: I0225 16:10:40.638915 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=-9223372005.215883 podStartE2EDuration="31.638892919s" podCreationTimestamp="2026-02-25 16:10:09 +0000 UTC" firstStartedPulling="2026-02-25 16:10:23.764075066 +0000 UTC m=+1474.777466946" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:10:40.633717979 +0000 UTC m=+1491.647109869" watchObservedRunningTime="2026-02-25 16:10:40.638892919 +0000 UTC m=+1491.652284799" Feb 25 16:10:40 crc kubenswrapper[4937]: I0225 16:10:40.655619 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-6xfpd" podStartSLOduration=2.6555907579999998 podStartE2EDuration="2.655590758s" podCreationTimestamp="2026-02-25 16:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:10:40.651276949 +0000 UTC m=+1491.664668849" watchObservedRunningTime="2026-02-25 16:10:40.655590758 +0000 UTC m=+1491.668982648" Feb 25 16:10:40 crc kubenswrapper[4937]: I0225 16:10:40.663028 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 25 16:10:40 crc kubenswrapper[4937]: I0225 16:10:40.766507 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6nkbm"] Feb 25 16:10:40 crc kubenswrapper[4937]: W0225 16:10:40.768462 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a0f0530_95e1_4231_9933_bedb49b72a88.slice/crio-ea22e5cce9aff8dddd101788c467357c52c58b4ffe43e782228d224a978ec10a WatchSource:0}: Error finding container ea22e5cce9aff8dddd101788c467357c52c58b4ffe43e782228d224a978ec10a: Status 404 returned error can't find the container with id ea22e5cce9aff8dddd101788c467357c52c58b4ffe43e782228d224a978ec10a Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.004318 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift\") pod \"swift-storage-0\" 
(UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:41 crc kubenswrapper[4937]: E0225 16:10:41.004671 4937 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 25 16:10:41 crc kubenswrapper[4937]: E0225 16:10:41.004722 4937 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 25 16:10:41 crc kubenswrapper[4937]: E0225 16:10:41.004807 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift podName:48d22af0-5579-46fb-889d-fd34e46d26e9 nodeName:}" failed. No retries permitted until 2026-02-25 16:10:43.004780981 +0000 UTC m=+1494.018172891 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift") pod "swift-storage-0" (UID: "48d22af0-5579-46fb-889d-fd34e46d26e9") : configmap "swift-ring-files" not found Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.381378 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc63570-c593-4ef8-bad6-efe017070990" path="/var/lib/kubelet/pods/6bc63570-c593-4ef8-bad6-efe017070990/volumes" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.403834 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.444149 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.494619 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.494685 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.494742 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.495577 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"710133016a8fda213d788ff3f0a0661f137f661d0c6764233454878cf67045e1"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.495633 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://710133016a8fda213d788ff3f0a0661f137f661d0c6764233454878cf67045e1" gracePeriod=600 Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.644991 4937 generic.go:334] 
"Generic (PLEG): container finished" podID="e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe" containerID="986aef7e61edd7d3ded7eb3186350b7282ecbe6821e865a04eef05e2444099ac" exitCode=0 Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.645121 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe","Type":"ContainerDied","Data":"986aef7e61edd7d3ded7eb3186350b7282ecbe6821e865a04eef05e2444099ac"} Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.650219 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6nkbm" event={"ID":"0a0f0530-95e1-4231-9933-bedb49b72a88","Type":"ContainerStarted","Data":"ea22e5cce9aff8dddd101788c467357c52c58b4ffe43e782228d224a978ec10a"} Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.654512 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="710133016a8fda213d788ff3f0a0661f137f661d0c6764233454878cf67045e1" exitCode=0 Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.654583 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"710133016a8fda213d788ff3f0a0661f137f661d0c6764233454878cf67045e1"} Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.654644 4937 scope.go:117] "RemoveContainer" containerID="82d7f39c6bdd0c324e2d3b37551824fca9f991542926e3b5f5cc2a5a3ef74dfb" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.656839 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.716605 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.950679 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.952456 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.959214 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.959418 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.959551 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.960077 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-98lnx" Feb 25 16:10:41 crc kubenswrapper[4937]: I0225 16:10:41.969329 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.034990 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.035448 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-scripts\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.035507 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.035560 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-config\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.035585 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.035605 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tfvq\" (UniqueName: \"kubernetes.io/projected/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-kube-api-access-5tfvq\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.035628 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: 
I0225 16:10:42.138309 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.138439 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-config\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.138478 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.138528 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tfvq\" (UniqueName: \"kubernetes.io/projected/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-kube-api-access-5tfvq\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.138595 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.138689 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.138737 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-scripts\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.140012 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-scripts\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.140138 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.141906 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-config\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.145877 4937 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.148061 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.148109 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.159241 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tfvq\" (UniqueName: \"kubernetes.io/projected/8ce79682-d3ee-4afb-ba50-fdacc0fe6910-kube-api-access-5tfvq\") pod \"ovn-northd-0\" (UID: \"8ce79682-d3ee-4afb-ba50-fdacc0fe6910\") " pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.306084 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.517271 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.625653 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.676356 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e"} Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.683533 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe","Type":"ContainerStarted","Data":"783f12c7e602be649846f6aeeda521b6e12d89f616cafaff8d54616710e2a6b1"} Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.726101 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371987.128693 podStartE2EDuration="49.726083427s" podCreationTimestamp="2026-02-25 16:09:53 +0000 UTC" firstStartedPulling="2026-02-25 16:09:55.050830391 +0000 UTC m=+1446.064222271" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:10:42.71863671 +0000 UTC m=+1493.732028620" watchObservedRunningTime="2026-02-25 16:10:42.726083427 +0000 UTC m=+1493.739475317" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.784791 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="123b3439-f7ab-44b4-bbed-02539668cf80" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.127:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers)" Feb 25 16:10:42 crc kubenswrapper[4937]: I0225 16:10:42.809171 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 25 16:10:43 crc kubenswrapper[4937]: I0225 16:10:43.067367 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:43 crc kubenswrapper[4937]: E0225 16:10:43.067569 4937 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 25 16:10:43 crc kubenswrapper[4937]: E0225 16:10:43.067595 4937 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 25 16:10:43 crc kubenswrapper[4937]: E0225 16:10:43.067656 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift podName:48d22af0-5579-46fb-889d-fd34e46d26e9 nodeName:}" failed. No retries permitted until 2026-02-25 16:10:47.067636158 +0000 UTC m=+1498.081028058 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift") pod "swift-storage-0" (UID: "48d22af0-5579-46fb-889d-fd34e46d26e9") : configmap "swift-ring-files" not found Feb 25 16:10:43 crc kubenswrapper[4937]: I0225 16:10:43.695448 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8ce79682-d3ee-4afb-ba50-fdacc0fe6910","Type":"ContainerStarted","Data":"8c95a7069e61697d723beddf7d1b812db159b9cebe3d44bad5586b776d598425"} Feb 25 16:10:43 crc kubenswrapper[4937]: I0225 16:10:43.697058 4937 generic.go:334] "Generic (PLEG): container finished" podID="d773f4d2-bec3-4379-a7a2-29975a18c85b" containerID="4cd0ad3573b75b13bcc09c4585a1fd172977199904c71b28892a8264d25edd65" exitCode=0 Feb 25 16:10:43 crc kubenswrapper[4937]: I0225 16:10:43.697155 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d773f4d2-bec3-4379-a7a2-29975a18c85b","Type":"ContainerDied","Data":"4cd0ad3573b75b13bcc09c4585a1fd172977199904c71b28892a8264d25edd65"} Feb 25 16:10:44 crc kubenswrapper[4937]: I0225 16:10:44.387791 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 25 16:10:44 crc kubenswrapper[4937]: I0225 16:10:44.388180 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 25 16:10:44 crc kubenswrapper[4937]: I0225 16:10:44.501466 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xq5ml"] Feb 25 16:10:44 crc kubenswrapper[4937]: I0225 16:10:44.504152 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xq5ml" Feb 25 16:10:44 crc kubenswrapper[4937]: I0225 16:10:44.506640 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 25 16:10:44 crc kubenswrapper[4937]: I0225 16:10:44.512390 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xq5ml"] Feb 25 16:10:44 crc kubenswrapper[4937]: I0225 16:10:44.599992 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1dea176-9856-4896-ae32-5b34a616da98-operator-scripts\") pod \"root-account-create-update-xq5ml\" (UID: \"f1dea176-9856-4896-ae32-5b34a616da98\") " pod="openstack/root-account-create-update-xq5ml" Feb 25 16:10:44 crc kubenswrapper[4937]: I0225 16:10:44.600165 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf4jr\" (UniqueName: \"kubernetes.io/projected/f1dea176-9856-4896-ae32-5b34a616da98-kube-api-access-hf4jr\") pod \"root-account-create-update-xq5ml\" (UID: \"f1dea176-9856-4896-ae32-5b34a616da98\") " pod="openstack/root-account-create-update-xq5ml" Feb 25 16:10:44 crc kubenswrapper[4937]: I0225 16:10:44.702380 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1dea176-9856-4896-ae32-5b34a616da98-operator-scripts\") pod \"root-account-create-update-xq5ml\" (UID: \"f1dea176-9856-4896-ae32-5b34a616da98\") " pod="openstack/root-account-create-update-xq5ml" Feb 25 16:10:44 crc kubenswrapper[4937]: I0225 16:10:44.702475 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf4jr\" (UniqueName: \"kubernetes.io/projected/f1dea176-9856-4896-ae32-5b34a616da98-kube-api-access-hf4jr\") pod \"root-account-create-update-xq5ml\" (UID: \"f1dea176-9856-4896-ae32-5b34a616da98\") " pod="openstack/root-account-create-update-xq5ml" Feb 25 16:10:44 crc kubenswrapper[4937]: I0225 16:10:44.703345 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1dea176-9856-4896-ae32-5b34a616da98-operator-scripts\") pod \"root-account-create-update-xq5ml\" (UID: \"f1dea176-9856-4896-ae32-5b34a616da98\") " pod="openstack/root-account-create-update-xq5ml" Feb 25 16:10:44 crc kubenswrapper[4937]: I0225 16:10:44.727773 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf4jr\" (UniqueName: \"kubernetes.io/projected/f1dea176-9856-4896-ae32-5b34a616da98-kube-api-access-hf4jr\") pod \"root-account-create-update-xq5ml\" (UID: \"f1dea176-9856-4896-ae32-5b34a616da98\") " pod="openstack/root-account-create-update-xq5ml" Feb 25 16:10:44 crc kubenswrapper[4937]: I0225 16:10:44.831547 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xq5ml" Feb 25 16:10:46 crc kubenswrapper[4937]: I0225 16:10:46.281205 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:46 crc kubenswrapper[4937]: I0225 16:10:46.982079 4937 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-wlfqx container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 16:10:46 crc kubenswrapper[4937]: I0225 16:10:46.982135 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-wlfqx" podUID="2bfcb195-48c8-46cd-b417-aacb40f615f4" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 16:10:47 crc kubenswrapper[4937]: I0225 16:10:47.139807 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-m4kc4" podUID="2e84eec9-8ff5-4f02-9596-e468e289dba0" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 16:10:47 crc kubenswrapper[4937]: I0225 16:10:47.139876 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-m4kc4" podUID="2e84eec9-8ff5-4f02-9596-e468e289dba0" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 16:10:47 crc kubenswrapper[4937]: I0225 16:10:47.149102 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:47 crc kubenswrapper[4937]: E0225 16:10:47.149368 4937 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 25 16:10:47 crc kubenswrapper[4937]: E0225 16:10:47.149398 4937 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 25 16:10:47 crc kubenswrapper[4937]: E0225 16:10:47.149458 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift podName:48d22af0-5579-46fb-889d-fd34e46d26e9 nodeName:}" failed. No retries permitted until 2026-02-25 16:10:55.149439073 +0000 UTC m=+1506.162830973 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift") pod "swift-storage-0" (UID: "48d22af0-5579-46fb-889d-fd34e46d26e9") : configmap "swift-ring-files" not found Feb 25 16:10:47 crc kubenswrapper[4937]: I0225 16:10:47.325381 4937 generic.go:334] "Generic (PLEG): container finished" podID="9f90fdcc-629f-46e9-9485-de80d43ea155" containerID="e32308d0642d3c8905970599b30df4838643779f555870be4d3c24d9df44e71d" exitCode=0 Feb 25 16:10:47 crc kubenswrapper[4937]: I0225 16:10:47.325427 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9f90fdcc-629f-46e9-9485-de80d43ea155","Type":"ContainerDied","Data":"e32308d0642d3c8905970599b30df4838643779f555870be4d3c24d9df44e71d"} Feb 25 16:10:48 crc kubenswrapper[4937]: I0225 16:10:48.388976 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 25 16:10:48 crc kubenswrapper[4937]: I0225 16:10:48.550803 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:10:48 crc kubenswrapper[4937]: I0225 16:10:48.604302 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-ztrnz"] Feb 25 16:10:48 crc kubenswrapper[4937]: I0225 16:10:48.604731 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" podUID="215bd06c-8d29-4761-ac40-911e2f9fcd73" containerName="dnsmasq-dns" containerID="cri-o://98a65cc81711341e7ef85196cc49fec21f8c06d1c9955b78d7a5485722104a36" gracePeriod=10 Feb 25 16:10:51 crc kubenswrapper[4937]: I0225 16:10:51.149770 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 25 16:10:51 crc kubenswrapper[4937]: I0225 16:10:51.280166 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" podUID="215bd06c-8d29-4761-ac40-911e2f9fcd73" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Feb 25 16:10:51 crc kubenswrapper[4937]: I0225 16:10:51.356407 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="08382e6d-e8e5-4656-a524-26c8269114fd" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 25 16:10:51 crc kubenswrapper[4937]: I0225 16:10:51.696100 4937 generic.go:334] "Generic (PLEG): container finished" podID="215bd06c-8d29-4761-ac40-911e2f9fcd73" containerID="98a65cc81711341e7ef85196cc49fec21f8c06d1c9955b78d7a5485722104a36" exitCode=0 Feb 25 16:10:51 crc kubenswrapper[4937]: I0225 16:10:51.696156 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" event={"ID":"215bd06c-8d29-4761-ac40-911e2f9fcd73","Type":"ContainerDied","Data":"98a65cc81711341e7ef85196cc49fec21f8c06d1c9955b78d7a5485722104a36"} Feb 25 16:10:53 crc kubenswrapper[4937]: I0225 16:10:53.184708 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xq5ml"] Feb 25 16:10:53 crc kubenswrapper[4937]: W0225 16:10:53.193882 4937 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1dea176_9856_4896_ae32_5b34a616da98.slice/crio-a51cb7a7bb15d6ce2f6948f3b9853c6789c02ef0058a3202db65a5e45e29b428 WatchSource:0}: Error finding container a51cb7a7bb15d6ce2f6948f3b9853c6789c02ef0058a3202db65a5e45e29b428: Status 404 returned error can't find the container with id a51cb7a7bb15d6ce2f6948f3b9853c6789c02ef0058a3202db65a5e45e29b428 Feb 25 16:10:53 crc kubenswrapper[4937]: I0225 16:10:53.201594 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 25 16:10:53 crc kubenswrapper[4937]: I0225 16:10:53.719352 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xq5ml" event={"ID":"f1dea176-9856-4896-ae32-5b34a616da98","Type":"ContainerStarted","Data":"a51cb7a7bb15d6ce2f6948f3b9853c6789c02ef0058a3202db65a5e45e29b428"} Feb 25 16:10:55 crc kubenswrapper[4937]: E0225 16:10:55.117922 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified" Feb 25 16:10:55 crc kubenswrapper[4937]: E0225 16:10:55.118667 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:swift-ring-rebalance,Image:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,Command:[/usr/local/bin/swift-ring-tool all],Args:[],WorkingDir:/etc/swift,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CM_NAME,Value:swift-ring-files,ValueFrom:nil,},EnvVar{Name:NAMESPACE,Value:openstack,ValueFrom:nil,},EnvVar{Name:OWNER_APIVERSION,Value:swift.openstack.org/v1beta1,ValueFrom:nil,},EnvVar{Name:OWNER_KIND,Value:SwiftRing,ValueFrom:nil,},EnvVar{Name:OWNER_NAME,Value:swift-ring,ValueFrom:nil,},EnvVar{Name:OWNER_UID,Value:6da70b09-4afb-472f-94f9-7342c515ac24,ValueFrom:nil,},EnvVar{Name:SWIFT_MIN_PART_HOURS,Value:1,ValueFrom:nil,},EnvVar{Name:SWIFT_PART_POWER,Value:10,ValueFrom:nil,},EnvVar{Name:SWIFT_REPLICAS,Value:1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/swift-ring-tool,SubPath:swift-ring-tool,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:swiftconf,ReadOnly:true,MountPath:/etc/swift/swift.conf,SubPath:swift.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-swift,ReadOnly:false,MountPath:/etc/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ring-data-devices,ReadOnly:true,MountPath:/var/lib/config-data/ring-devices,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dispersionconf,ReadOnly:true,MountPath:/etc/swift/dispersion.conf,SubPath:dispersion.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2vn2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabili
ties{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42445,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-ring-rebalance-6nkbm_openstack(0a0f0530-95e1-4231-9933-bedb49b72a88): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:10:55 crc kubenswrapper[4937]: E0225 16:10:55.119889 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"swift-ring-rebalance\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/swift-ring-rebalance-6nkbm" podUID="0a0f0530-95e1-4231-9933-bedb49b72a88" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.166870 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:10:55 crc kubenswrapper[4937]: E0225 16:10:55.167075 4937 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 25 16:10:55 crc kubenswrapper[4937]: E0225 16:10:55.167107 4937 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 25 16:10:55 crc kubenswrapper[4937]: E0225 16:10:55.167169 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift podName:48d22af0-5579-46fb-889d-fd34e46d26e9 nodeName:}" failed. No retries permitted until 2026-02-25 16:11:11.167152473 +0000 UTC m=+1522.180544363 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift") pod "swift-storage-0" (UID: "48d22af0-5579-46fb-889d-fd34e46d26e9") : configmap "swift-ring-files" not found Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.394431 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.404889 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.573403 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52qj2\" (UniqueName: \"kubernetes.io/projected/9f90fdcc-629f-46e9-9485-de80d43ea155-kube-api-access-52qj2\") pod \"9f90fdcc-629f-46e9-9485-de80d43ea155\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.573514 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-0\") pod \"9f90fdcc-629f-46e9-9485-de80d43ea155\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.573564 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdcc-629f-46e9-9485-de80d43ea155-config-out\") pod \"9f90fdcc-629f-46e9-9485-de80d43ea155\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.573622 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-dns-svc\") pod \"215bd06c-8d29-4761-ac40-911e2f9fcd73\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.573800 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"9f90fdcc-629f-46e9-9485-de80d43ea155\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.573845 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f90fdcc-629f-46e9-9485-de80d43ea155-tls-assets\") pod \"9f90fdcc-629f-46e9-9485-de80d43ea155\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.573875 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-2\") pod \"9f90fdcc-629f-46e9-9485-de80d43ea155\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.573890 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-config\") pod \"9f90fdcc-629f-46e9-9485-de80d43ea155\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.573914 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-thanos-prometheus-http-client-file\") pod \"9f90fdcc-629f-46e9-9485-de80d43ea155\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.573933 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-config\") pod \"215bd06c-8d29-4761-ac40-911e2f9fcd73\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.573963 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48ntq\" (UniqueName: \"kubernetes.io/projected/215bd06c-8d29-4761-ac40-911e2f9fcd73-kube-api-access-48ntq\") pod \"215bd06c-8d29-4761-ac40-911e2f9fcd73\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.573985 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-ovsdbserver-nb\") pod \"215bd06c-8d29-4761-ac40-911e2f9fcd73\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.574007 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-web-config\") pod \"9f90fdcc-629f-46e9-9485-de80d43ea155\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.574044 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-1\") pod \"9f90fdcc-629f-46e9-9485-de80d43ea155\" (UID: \"9f90fdcc-629f-46e9-9485-de80d43ea155\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.574065 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-ovsdbserver-sb\") pod \"215bd06c-8d29-4761-ac40-911e2f9fcd73\" (UID: \"215bd06c-8d29-4761-ac40-911e2f9fcd73\") " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.574463 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "9f90fdcc-629f-46e9-9485-de80d43ea155" (UID: "9f90fdcc-629f-46e9-9485-de80d43ea155"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.576745 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "9f90fdcc-629f-46e9-9485-de80d43ea155" (UID: "9f90fdcc-629f-46e9-9485-de80d43ea155"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.582593 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-config" (OuterVolumeSpecName: "config") pod "9f90fdcc-629f-46e9-9485-de80d43ea155" (UID: "9f90fdcc-629f-46e9-9485-de80d43ea155"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.583471 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f90fdcc-629f-46e9-9485-de80d43ea155-kube-api-access-52qj2" (OuterVolumeSpecName: "kube-api-access-52qj2") pod "9f90fdcc-629f-46e9-9485-de80d43ea155" (UID: "9f90fdcc-629f-46e9-9485-de80d43ea155"). InnerVolumeSpecName "kube-api-access-52qj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.583794 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "9f90fdcc-629f-46e9-9485-de80d43ea155" (UID: "9f90fdcc-629f-46e9-9485-de80d43ea155"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.584642 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215bd06c-8d29-4761-ac40-911e2f9fcd73-kube-api-access-48ntq" (OuterVolumeSpecName: "kube-api-access-48ntq") pod "215bd06c-8d29-4761-ac40-911e2f9fcd73" (UID: "215bd06c-8d29-4761-ac40-911e2f9fcd73"). InnerVolumeSpecName "kube-api-access-48ntq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.584645 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-web-config" (OuterVolumeSpecName: "web-config") pod "9f90fdcc-629f-46e9-9485-de80d43ea155" (UID: "9f90fdcc-629f-46e9-9485-de80d43ea155"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.587845 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f90fdcc-629f-46e9-9485-de80d43ea155-config-out" (OuterVolumeSpecName: "config-out") pod "9f90fdcc-629f-46e9-9485-de80d43ea155" (UID: "9f90fdcc-629f-46e9-9485-de80d43ea155"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.588351 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f90fdcc-629f-46e9-9485-de80d43ea155-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9f90fdcc-629f-46e9-9485-de80d43ea155" (UID: "9f90fdcc-629f-46e9-9485-de80d43ea155"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.597664 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "9f90fdcc-629f-46e9-9485-de80d43ea155" (UID: "9f90fdcc-629f-46e9-9485-de80d43ea155"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.599961 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "9f90fdcc-629f-46e9-9485-de80d43ea155" (UID: "9f90fdcc-629f-46e9-9485-de80d43ea155"). InnerVolumeSpecName "pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.628210 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-config" (OuterVolumeSpecName: "config") pod "215bd06c-8d29-4761-ac40-911e2f9fcd73" (UID: "215bd06c-8d29-4761-ac40-911e2f9fcd73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.637704 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "215bd06c-8d29-4761-ac40-911e2f9fcd73" (UID: "215bd06c-8d29-4761-ac40-911e2f9fcd73"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.641145 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "215bd06c-8d29-4761-ac40-911e2f9fcd73" (UID: "215bd06c-8d29-4761-ac40-911e2f9fcd73"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.651977 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "215bd06c-8d29-4761-ac40-911e2f9fcd73" (UID: "215bd06c-8d29-4761-ac40-911e2f9fcd73"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.676933 4937 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") on node \"crc\" " Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680315 4937 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9f90fdcc-629f-46e9-9485-de80d43ea155-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680341 4937 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680360 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680370 4937 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680380 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680390 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48ntq\" (UniqueName: \"kubernetes.io/projected/215bd06c-8d29-4761-ac40-911e2f9fcd73-kube-api-access-48ntq\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680399 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680408 4937 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9f90fdcc-629f-46e9-9485-de80d43ea155-web-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680417 4937 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680425 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680436 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52qj2\" (UniqueName: \"kubernetes.io/projected/9f90fdcc-629f-46e9-9485-de80d43ea155-kube-api-access-52qj2\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680445 4937 reconciler_common.go:293] "Volume detached for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9f90fdcc-629f-46e9-9485-de80d43ea155-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680454 4937 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdcc-629f-46e9-9485-de80d43ea155-config-out\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.680462 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/215bd06c-8d29-4761-ac40-911e2f9fcd73-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.697912 4937 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.698210 4937 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4") on node "crc" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.737793 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" event={"ID":"215bd06c-8d29-4761-ac40-911e2f9fcd73","Type":"ContainerDied","Data":"f3e37f6b53165fa9bcce829fedef12ea4cfc2e4ec669e5b51f3180e2946dbe3c"} Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.737906 4937 scope.go:117] "RemoveContainer" containerID="98a65cc81711341e7ef85196cc49fec21f8c06d1c9955b78d7a5485722104a36" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.738029 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-ztrnz" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.747834 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9f90fdcc-629f-46e9-9485-de80d43ea155","Type":"ContainerDied","Data":"438dffc0133f7ac2f50c675b2864d60579b1e4a90b9b3ec7c4a4c9e908b21271"} Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.747881 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.750607 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xq5ml" event={"ID":"f1dea176-9856-4896-ae32-5b34a616da98","Type":"ContainerStarted","Data":"29e6688b03fc9b75617fe3e732dcc1252c3dcb1e14b062bb630fe55ccd5b9d82"} Feb 25 16:10:55 crc kubenswrapper[4937]: E0225 16:10:55.753559 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"swift-ring-rebalance\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified\\\"\"" pod="openstack/swift-ring-rebalance-6nkbm" podUID="0a0f0530-95e1-4231-9933-bedb49b72a88" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.770715 4937 scope.go:117] "RemoveContainer" containerID="ea7075aff1e1f8be4ecf2beeb84ea13807662c8c37a50c90aaf98862f00803ef" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.773934 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-xq5ml" podStartSLOduration=11.773913302 podStartE2EDuration="11.773913302s" podCreationTimestamp="2026-02-25 16:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:10:55.770460836 +0000 UTC m=+1506.783852746" watchObservedRunningTime="2026-02-25 16:10:55.773913302 +0000 UTC m=+1506.787305182" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.782031 4937 reconciler_common.go:293] "Volume detached for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") on node \"crc\" DevicePath \"\"" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.795894 4937 scope.go:117] "RemoveContainer" containerID="e32308d0642d3c8905970599b30df4838643779f555870be4d3c24d9df44e71d" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.838060 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.860359 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.874836 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-ztrnz"] Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.889985 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-ztrnz"] Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.893583 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:10:55 crc kubenswrapper[4937]: E0225 16:10:55.894003 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215bd06c-8d29-4761-ac40-911e2f9fcd73" containerName="dnsmasq-dns" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.894022 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="215bd06c-8d29-4761-ac40-911e2f9fcd73" containerName="dnsmasq-dns" Feb 25 16:10:55 crc kubenswrapper[4937]: E0225 16:10:55.894055 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f90fdcc-629f-46e9-9485-de80d43ea155" containerName="init-config-reloader" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.894064 4937 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9f90fdcc-629f-46e9-9485-de80d43ea155" containerName="init-config-reloader" Feb 25 16:10:55 crc kubenswrapper[4937]: E0225 16:10:55.894091 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215bd06c-8d29-4761-ac40-911e2f9fcd73" containerName="init" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.894099 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="215bd06c-8d29-4761-ac40-911e2f9fcd73" containerName="init" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.894316 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f90fdcc-629f-46e9-9485-de80d43ea155" containerName="init-config-reloader" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.894336 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="215bd06c-8d29-4761-ac40-911e2f9fcd73" containerName="dnsmasq-dns" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.896753 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.900605 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.901355 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.901967 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.902048 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.901967 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.902167 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.902991 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-8sz4b" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.904044 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.932241 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.988783 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.988944 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: 
\"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.989173 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.989218 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.989252 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e19ac505-41d9-4d1d-b75a-0c88e26960c8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.989330 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e19ac505-41d9-4d1d-b75a-0c88e26960c8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.989407 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-config\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.989531 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.989634 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmxns\" (UniqueName: \"kubernetes.io/projected/e19ac505-41d9-4d1d-b75a-0c88e26960c8-kube-api-access-zmxns\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:55 crc kubenswrapper[4937]: I0225 16:10:55.989658 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.091173 4937 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e19ac505-41d9-4d1d-b75a-0c88e26960c8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.091259 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-config\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.091291 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.091331 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmxns\" (UniqueName: \"kubernetes.io/projected/e19ac505-41d9-4d1d-b75a-0c88e26960c8-kube-api-access-zmxns\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.091368 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.091412 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.091447 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.091527 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.091555 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 
25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.091573 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e19ac505-41d9-4d1d-b75a-0c88e26960c8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.093007 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.093346 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.093666 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.095377 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.095933 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e19ac505-41d9-4d1d-b75a-0c88e26960c8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.096165 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e19ac505-41d9-4d1d-b75a-0c88e26960c8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.096276 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.096309 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bcbb07c66c16a8bfff5bebc3b08f557f1544f8693883973b0b05074d97af7e5f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.096334 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-config\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.101209 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.110147 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmxns\" (UniqueName: \"kubernetes.io/projected/e19ac505-41d9-4d1d-b75a-0c88e26960c8-kube-api-access-zmxns\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.134309 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"prometheus-metric-storage-0\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.240513 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 25 16:10:56 crc kubenswrapper[4937]: I0225 16:10:56.868892 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:10:57 crc kubenswrapper[4937]: I0225 16:10:57.380614 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215bd06c-8d29-4761-ac40-911e2f9fcd73" path="/var/lib/kubelet/pods/215bd06c-8d29-4761-ac40-911e2f9fcd73/volumes" Feb 25 16:10:57 crc kubenswrapper[4937]: I0225 16:10:57.382141 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f90fdcc-629f-46e9-9485-de80d43ea155" path="/var/lib/kubelet/pods/9f90fdcc-629f-46e9-9485-de80d43ea155/volumes" Feb 25 16:10:57 crc kubenswrapper[4937]: I0225 16:10:57.789593 4937 generic.go:334] "Generic (PLEG): container finished" podID="f1dea176-9856-4896-ae32-5b34a616da98" containerID="29e6688b03fc9b75617fe3e732dcc1252c3dcb1e14b062bb630fe55ccd5b9d82" exitCode=0 Feb 25 16:10:57 crc kubenswrapper[4937]: I0225 16:10:57.789686 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xq5ml" event={"ID":"f1dea176-9856-4896-ae32-5b34a616da98","Type":"ContainerDied","Data":"29e6688b03fc9b75617fe3e732dcc1252c3dcb1e14b062bb630fe55ccd5b9d82"} Feb 25 16:10:57 crc kubenswrapper[4937]: I0225 16:10:57.791454 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e19ac505-41d9-4d1d-b75a-0c88e26960c8","Type":"ContainerStarted","Data":"4b993c8a26f5fc338f51f2888f7966159661bbf45b30ab38232fb63135a2c92d"} Feb 25 16:10:59 crc kubenswrapper[4937]: I0225 16:10:59.518013 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-mplc7" Feb 25 16:10:59 crc kubenswrapper[4937]: I0225 16:10:59.592365 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 25 16:10:59 crc kubenswrapper[4937]: I0225 16:10:59.681142 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-86646" Feb 25 16:10:59 crc kubenswrapper[4937]: I0225 16:10:59.760804 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 25 16:10:59 crc kubenswrapper[4937]: I0225 16:10:59.897054 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p" Feb 25 16:11:00 crc kubenswrapper[4937]: I0225 16:11:00.790974 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.013057 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="08382e6d-e8e5-4656-a524-26c8269114fd" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.305368 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-zbnk5"] Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.306923 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-zbnk5" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.317695 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zbnk5"] Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.411546 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1246-account-create-update-c8jvn"] Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.413206 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1246-account-create-update-c8jvn" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.424979 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.426126 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c163075d-e1f7-4252-92aa-17b9bbfe336a-operator-scripts\") pod \"glance-db-create-zbnk5\" (UID: \"c163075d-e1f7-4252-92aa-17b9bbfe336a\") " pod="openstack/glance-db-create-zbnk5" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.426154 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-httb9\" (UniqueName: \"kubernetes.io/projected/c163075d-e1f7-4252-92aa-17b9bbfe336a-kube-api-access-httb9\") pod \"glance-db-create-zbnk5\" (UID: \"c163075d-e1f7-4252-92aa-17b9bbfe336a\") " pod="openstack/glance-db-create-zbnk5" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.435783 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1246-account-create-update-c8jvn"] Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.527588 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txwlm\" (UniqueName: \"kubernetes.io/projected/2530391c-1cbc-4c0c-ab27-bba9cfcc5149-kube-api-access-txwlm\") pod \"glance-1246-account-create-update-c8jvn\" (UID: \"2530391c-1cbc-4c0c-ab27-bba9cfcc5149\") " pod="openstack/glance-1246-account-create-update-c8jvn" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.527758 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c163075d-e1f7-4252-92aa-17b9bbfe336a-operator-scripts\") pod \"glance-db-create-zbnk5\" (UID: \"c163075d-e1f7-4252-92aa-17b9bbfe336a\") " pod="openstack/glance-db-create-zbnk5" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.527790 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-httb9\" (UniqueName: \"kubernetes.io/projected/c163075d-e1f7-4252-92aa-17b9bbfe336a-kube-api-access-httb9\") pod \"glance-db-create-zbnk5\" (UID: \"c163075d-e1f7-4252-92aa-17b9bbfe336a\") " pod="openstack/glance-db-create-zbnk5" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.528027 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2530391c-1cbc-4c0c-ab27-bba9cfcc5149-operator-scripts\") pod \"glance-1246-account-create-update-c8jvn\" (UID: \"2530391c-1cbc-4c0c-ab27-bba9cfcc5149\") " pod="openstack/glance-1246-account-create-update-c8jvn" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.528900 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c163075d-e1f7-4252-92aa-17b9bbfe336a-operator-scripts\") pod \"glance-db-create-zbnk5\" (UID: \"c163075d-e1f7-4252-92aa-17b9bbfe336a\") " pod="openstack/glance-db-create-zbnk5" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.545784 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-httb9\" (UniqueName: \"kubernetes.io/projected/c163075d-e1f7-4252-92aa-17b9bbfe336a-kube-api-access-httb9\") pod \"glance-db-create-zbnk5\" (UID: \"c163075d-e1f7-4252-92aa-17b9bbfe336a\") " pod="openstack/glance-db-create-zbnk5" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.629746 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2530391c-1cbc-4c0c-ab27-bba9cfcc5149-operator-scripts\") pod \"glance-1246-account-create-update-c8jvn\" (UID: \"2530391c-1cbc-4c0c-ab27-bba9cfcc5149\") " pod="openstack/glance-1246-account-create-update-c8jvn" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.630081 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txwlm\" (UniqueName: \"kubernetes.io/projected/2530391c-1cbc-4c0c-ab27-bba9cfcc5149-kube-api-access-txwlm\") pod \"glance-1246-account-create-update-c8jvn\" (UID: \"2530391c-1cbc-4c0c-ab27-bba9cfcc5149\") " pod="openstack/glance-1246-account-create-update-c8jvn" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.630411 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2530391c-1cbc-4c0c-ab27-bba9cfcc5149-operator-scripts\") pod \"glance-1246-account-create-update-c8jvn\" (UID: \"2530391c-1cbc-4c0c-ab27-bba9cfcc5149\") " pod="openstack/glance-1246-account-create-update-c8jvn" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.640657 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zbnk5" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.646873 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txwlm\" (UniqueName: \"kubernetes.io/projected/2530391c-1cbc-4c0c-ab27-bba9cfcc5149-kube-api-access-txwlm\") pod \"glance-1246-account-create-update-c8jvn\" (UID: \"2530391c-1cbc-4c0c-ab27-bba9cfcc5149\") " pod="openstack/glance-1246-account-create-update-c8jvn" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.732389 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1246-account-create-update-c8jvn" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.823652 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xq5ml" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.826575 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xq5ml" event={"ID":"f1dea176-9856-4896-ae32-5b34a616da98","Type":"ContainerDied","Data":"a51cb7a7bb15d6ce2f6948f3b9853c6789c02ef0058a3202db65a5e45e29b428"} Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.826602 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a51cb7a7bb15d6ce2f6948f3b9853c6789c02ef0058a3202db65a5e45e29b428" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.934456 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1dea176-9856-4896-ae32-5b34a616da98-operator-scripts\") pod \"f1dea176-9856-4896-ae32-5b34a616da98\" (UID: \"f1dea176-9856-4896-ae32-5b34a616da98\") " Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.934578 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf4jr\" (UniqueName: \"kubernetes.io/projected/f1dea176-9856-4896-ae32-5b34a616da98-kube-api-access-hf4jr\") pod \"f1dea176-9856-4896-ae32-5b34a616da98\" (UID: \"f1dea176-9856-4896-ae32-5b34a616da98\") " Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.935358 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1dea176-9856-4896-ae32-5b34a616da98-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1dea176-9856-4896-ae32-5b34a616da98" (UID: "f1dea176-9856-4896-ae32-5b34a616da98"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:01 crc kubenswrapper[4937]: I0225 16:11:01.938830 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1dea176-9856-4896-ae32-5b34a616da98-kube-api-access-hf4jr" (OuterVolumeSpecName: "kube-api-access-hf4jr") pod "f1dea176-9856-4896-ae32-5b34a616da98" (UID: "f1dea176-9856-4896-ae32-5b34a616da98"). InnerVolumeSpecName "kube-api-access-hf4jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:02 crc kubenswrapper[4937]: I0225 16:11:02.036926 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1dea176-9856-4896-ae32-5b34a616da98-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:02 crc kubenswrapper[4937]: I0225 16:11:02.036963 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf4jr\" (UniqueName: \"kubernetes.io/projected/f1dea176-9856-4896-ae32-5b34a616da98-kube-api-access-hf4jr\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:02 crc kubenswrapper[4937]: I0225 16:11:02.835644 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xq5ml" Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.021471 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xq5ml"] Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.030362 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xq5ml"] Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.110222 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-lfjf9"] Feb 25 16:11:03 crc kubenswrapper[4937]: E0225 16:11:03.110629 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1dea176-9856-4896-ae32-5b34a616da98" containerName="mariadb-account-create-update" Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.110648 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1dea176-9856-4896-ae32-5b34a616da98" containerName="mariadb-account-create-update" Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.110886 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1dea176-9856-4896-ae32-5b34a616da98" containerName="mariadb-account-create-update" Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.111842 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lfjf9" Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.114169 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.121234 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lfjf9"] Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.260415 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdfcp\" (UniqueName: \"kubernetes.io/projected/f87815ec-3a3b-4521-97ef-abf88231d48b-kube-api-access-sdfcp\") pod \"root-account-create-update-lfjf9\" (UID: \"f87815ec-3a3b-4521-97ef-abf88231d48b\") " pod="openstack/root-account-create-update-lfjf9" Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.260837 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f87815ec-3a3b-4521-97ef-abf88231d48b-operator-scripts\") pod \"root-account-create-update-lfjf9\" (UID: \"f87815ec-3a3b-4521-97ef-abf88231d48b\") " pod="openstack/root-account-create-update-lfjf9" Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.363626 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdfcp\" (UniqueName: \"kubernetes.io/projected/f87815ec-3a3b-4521-97ef-abf88231d48b-kube-api-access-sdfcp\") pod \"root-account-create-update-lfjf9\" (UID: \"f87815ec-3a3b-4521-97ef-abf88231d48b\") " pod="openstack/root-account-create-update-lfjf9" Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.363915 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f87815ec-3a3b-4521-97ef-abf88231d48b-operator-scripts\") pod \"root-account-create-update-lfjf9\" (UID: \"f87815ec-3a3b-4521-97ef-abf88231d48b\") " pod="openstack/root-account-create-update-lfjf9" Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.365477 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f87815ec-3a3b-4521-97ef-abf88231d48b-operator-scripts\") pod \"root-account-create-update-lfjf9\" (UID: \"f87815ec-3a3b-4521-97ef-abf88231d48b\") " pod="openstack/root-account-create-update-lfjf9" Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.394253 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1dea176-9856-4896-ae32-5b34a616da98" path="/var/lib/kubelet/pods/f1dea176-9856-4896-ae32-5b34a616da98/volumes" Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.401814 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdfcp\" (UniqueName: \"kubernetes.io/projected/f87815ec-3a3b-4521-97ef-abf88231d48b-kube-api-access-sdfcp\") pod \"root-account-create-update-lfjf9\" (UID: \"f87815ec-3a3b-4521-97ef-abf88231d48b\") " pod="openstack/root-account-create-update-lfjf9" Feb 25 16:11:03 crc kubenswrapper[4937]: I0225 16:11:03.429991 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lfjf9" Feb 25 16:11:05 crc kubenswrapper[4937]: E0225 16:11:05.603733 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified" Feb 25 16:11:05 crc kubenswrapper[4937]: E0225 16:11:05.605051 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-northd,Image:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,Command:[/usr/bin/ovn-northd],Args:[-vfile:off -vconsole:info --n-threads=1 --ovnnb-db=ssl:ovsdbserver-nb-0.openstack.svc.cluster.local:6641 --ovnsb-db=ssl:ovsdbserver-sb-0.openstack.svc.cluster.local:6642 --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbchbh5c8h56dhf9hc5hc4h5f4h547h679h674h564h597h84h65dh7fhc5hf9h5b6h67ch685h56fh56ch5c9h5cfh55ch57h5d8h5d6hb4h596h545q,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:certs,Value:n656h697h58h65dh674h68ch5b4h5f4h695h9fh64h587h568h4h669h68dh5cfhc7h596h545hcbh677h5f8h656h5cdh7bh5cdh579hcch54bh56bh87q,ValueFrom:nil,},EnvVar{Name:certs_metrics,Value:n98hf6h5d8hcbhdbh589h59fh5fbh7ch89h57ch675h66fh58h55dhb7h66dh68ch559hc5h67bh97h58ch65dh589h67bhbdh5cfh5b8h98h555hd5q,ValueFrom:nil,},EnvVar{Name:ovnnorthd-config,Value:n5c8h7ch56bh8dh8hc4h5dch9dh68h6bhb7h598h549h5dbh66fh6bh5b4h5cch5d6h55ch57fhfch588h89h5ddh5d6h65bh65bh8dhc4h67dh569q,ValueFrom:nil,},EnvVar{Name:ovnnorthd-scripts,Value:n664hd8h66ch58dh64hc9h66bhd4h558h697h67bh557hdch664h567h669h555h696h556h556h5fh5bh569hbh665h9dh4h9bh564hc8h5b7h5c4q,ValueFrom:nil,},EnvVar{Name:tls-ca-bundle.pem,Value:n54bh596h554h687hd9h5bfh5f5h556h697h59chfch566h5ffhf4h55ch95hdbh597h89h64fh65bh555h84h587h5c5h86h654h66fh554hf7h59h698q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5tfvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/status_check.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:15,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/status_check.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:15,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,Res
izePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-northd-0_openstack(8ce79682-d3ee-4afb-ba50-fdacc0fe6910): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:11:06 crc kubenswrapper[4937]: E0225 16:11:06.412671 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-northd-0" podUID="8ce79682-d3ee-4afb-ba50-fdacc0fe6910" Feb 25 16:11:06 crc kubenswrapper[4937]: I0225 16:11:06.636250 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zbnk5"] Feb 25 16:11:06 crc kubenswrapper[4937]: W0225 16:11:06.637772 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc163075d_e1f7_4252_92aa_17b9bbfe336a.slice/crio-2770a7df323fd0f135ce59045b58fd507fea558be50afe2341a3ca0bc96676eb WatchSource:0}: Error finding container 2770a7df323fd0f135ce59045b58fd507fea558be50afe2341a3ca0bc96676eb: Status 404 returned error can't find the container with id 2770a7df323fd0f135ce59045b58fd507fea558be50afe2341a3ca0bc96676eb Feb 25 16:11:06 crc kubenswrapper[4937]: I0225 16:11:06.645793 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1246-account-create-update-c8jvn"] Feb 25 16:11:06 crc kubenswrapper[4937]: I0225 16:11:06.840449 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lfjf9"] Feb 25 16:11:06 crc kubenswrapper[4937]: W0225 16:11:06.842810 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf87815ec_3a3b_4521_97ef_abf88231d48b.slice/crio-af1d88d3dbe35d11921d4585943f4b9ad1a513aada937bd07d1b2c812df57b42 WatchSource:0}: Error finding container af1d88d3dbe35d11921d4585943f4b9ad1a513aada937bd07d1b2c812df57b42: Status 404 returned error can't find the container with id af1d88d3dbe35d11921d4585943f4b9ad1a513aada937bd07d1b2c812df57b42 Feb 25 16:11:06 crc kubenswrapper[4937]: I0225 16:11:06.881339 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zbnk5" event={"ID":"c163075d-e1f7-4252-92aa-17b9bbfe336a","Type":"ContainerStarted","Data":"2770a7df323fd0f135ce59045b58fd507fea558be50afe2341a3ca0bc96676eb"} Feb 25 16:11:06 crc kubenswrapper[4937]: I0225 16:11:06.883112 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lfjf9" event={"ID":"f87815ec-3a3b-4521-97ef-abf88231d48b","Type":"ContainerStarted","Data":"af1d88d3dbe35d11921d4585943f4b9ad1a513aada937bd07d1b2c812df57b42"} Feb 25 16:11:06 crc kubenswrapper[4937]: I0225 16:11:06.886014 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1246-account-create-update-c8jvn" event={"ID":"2530391c-1cbc-4c0c-ab27-bba9cfcc5149","Type":"ContainerStarted","Data":"bf86df82772e405cf0e6a2de76ba0b7f401e50abb7cc408cf60e063b5e84dec4"} Feb 25 16:11:06 crc kubenswrapper[4937]: I0225 16:11:06.888371 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8ce79682-d3ee-4afb-ba50-fdacc0fe6910","Type":"ContainerStarted","Data":"f56eef5c10cbc77f511c5fd5433967c402cf08e12c79be5af73e17cd98cce064"} Feb 25 16:11:06 crc kubenswrapper[4937]: E0225 16:11:06.891302 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovn-northd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified\\\"\"" pod="openstack/ovn-northd-0" podUID="8ce79682-d3ee-4afb-ba50-fdacc0fe6910" Feb 25 16:11:06 crc kubenswrapper[4937]: I0225 16:11:06.895186 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d773f4d2-bec3-4379-a7a2-29975a18c85b","Type":"ContainerStarted","Data":"b5d6d58b3c04a085b07440a08fa5bf38038c6176e2c5891fe9d5d582d1355b0f"} Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.075172 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-g8745"] Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.077856 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g8745" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.084980 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-g8745"] Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.144039 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgtgp\" (UniqueName: \"kubernetes.io/projected/c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3-kube-api-access-zgtgp\") pod \"keystone-db-create-g8745\" (UID: \"c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3\") " pod="openstack/keystone-db-create-g8745" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.144152 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3-operator-scripts\") pod \"keystone-db-create-g8745\" (UID: \"c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3\") " pod="openstack/keystone-db-create-g8745" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.182406 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5d21-account-create-update-4j4wt"] Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.183993 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5d21-account-create-update-4j4wt" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.187411 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.201864 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d21-account-create-update-4j4wt"] Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.246157 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2269b3b-bbaf-44bc-a77c-059195d12b86-operator-scripts\") pod \"keystone-5d21-account-create-update-4j4wt\" (UID: \"c2269b3b-bbaf-44bc-a77c-059195d12b86\") " pod="openstack/keystone-5d21-account-create-update-4j4wt" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.246204 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3-operator-scripts\") pod \"keystone-db-create-g8745\" (UID: \"c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3\") " pod="openstack/keystone-db-create-g8745" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.246238 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwc8s\" (UniqueName: \"kubernetes.io/projected/c2269b3b-bbaf-44bc-a77c-059195d12b86-kube-api-access-zwc8s\") pod \"keystone-5d21-account-create-update-4j4wt\" (UID: \"c2269b3b-bbaf-44bc-a77c-059195d12b86\") " pod="openstack/keystone-5d21-account-create-update-4j4wt" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.246327 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgtgp\" (UniqueName: \"kubernetes.io/projected/c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3-kube-api-access-zgtgp\") pod \"keystone-db-create-g8745\" (UID: \"c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3\") " pod="openstack/keystone-db-create-g8745" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.247257 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3-operator-scripts\") pod \"keystone-db-create-g8745\" (UID: \"c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3\") " pod="openstack/keystone-db-create-g8745" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.270305 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-8bftw"] Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.271645 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8bftw" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.275066 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgtgp\" (UniqueName: \"kubernetes.io/projected/c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3-kube-api-access-zgtgp\") pod \"keystone-db-create-g8745\" (UID: \"c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3\") " pod="openstack/keystone-db-create-g8745" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.282853 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8bftw"] Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.347653 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpbnv\" (UniqueName: \"kubernetes.io/projected/fdbe96df-78a6-449e-affb-3529fdc05d49-kube-api-access-zpbnv\") pod \"placement-db-create-8bftw\" (UID: \"fdbe96df-78a6-449e-affb-3529fdc05d49\") " pod="openstack/placement-db-create-8bftw" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.347703 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwc8s\" (UniqueName: \"kubernetes.io/projected/c2269b3b-bbaf-44bc-a77c-059195d12b86-kube-api-access-zwc8s\") pod \"keystone-5d21-account-create-update-4j4wt\" (UID: \"c2269b3b-bbaf-44bc-a77c-059195d12b86\") " pod="openstack/keystone-5d21-account-create-update-4j4wt" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.347827 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbe96df-78a6-449e-affb-3529fdc05d49-operator-scripts\") pod \"placement-db-create-8bftw\" (UID: \"fdbe96df-78a6-449e-affb-3529fdc05d49\") " pod="openstack/placement-db-create-8bftw" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.347878 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2269b3b-bbaf-44bc-a77c-059195d12b86-operator-scripts\") pod \"keystone-5d21-account-create-update-4j4wt\" (UID: \"c2269b3b-bbaf-44bc-a77c-059195d12b86\") " pod="openstack/keystone-5d21-account-create-update-4j4wt" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.348549 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2269b3b-bbaf-44bc-a77c-059195d12b86-operator-scripts\") pod \"keystone-5d21-account-create-update-4j4wt\" (UID: \"c2269b3b-bbaf-44bc-a77c-059195d12b86\") " pod="openstack/keystone-5d21-account-create-update-4j4wt" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.383054 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c72d-account-create-update-hrdcp"] Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.384574 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c72d-account-create-update-hrdcp" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.387068 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.396478 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c72d-account-create-update-hrdcp"] Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.396714 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-g8745" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.450716 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbe96df-78a6-449e-affb-3529fdc05d49-operator-scripts\") pod \"placement-db-create-8bftw\" (UID: \"fdbe96df-78a6-449e-affb-3529fdc05d49\") " pod="openstack/placement-db-create-8bftw" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.450839 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpbnv\" (UniqueName: \"kubernetes.io/projected/fdbe96df-78a6-449e-affb-3529fdc05d49-kube-api-access-zpbnv\") pod \"placement-db-create-8bftw\" (UID: \"fdbe96df-78a6-449e-affb-3529fdc05d49\") " pod="openstack/placement-db-create-8bftw" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.450868 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r7dr\" (UniqueName: \"kubernetes.io/projected/05e337d3-8903-40f9-84d2-bb1e0e7d4629-kube-api-access-6r7dr\") pod \"placement-c72d-account-create-update-hrdcp\" (UID: \"05e337d3-8903-40f9-84d2-bb1e0e7d4629\") " pod="openstack/placement-c72d-account-create-update-hrdcp" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.451049 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05e337d3-8903-40f9-84d2-bb1e0e7d4629-operator-scripts\") pod \"placement-c72d-account-create-update-hrdcp\" (UID: \"05e337d3-8903-40f9-84d2-bb1e0e7d4629\") " pod="openstack/placement-c72d-account-create-update-hrdcp" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.452536 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbe96df-78a6-449e-affb-3529fdc05d49-operator-scripts\") pod \"placement-db-create-8bftw\" (UID: \"fdbe96df-78a6-449e-affb-3529fdc05d49\") " pod="openstack/placement-db-create-8bftw" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.479668 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwc8s\" (UniqueName: \"kubernetes.io/projected/c2269b3b-bbaf-44bc-a77c-059195d12b86-kube-api-access-zwc8s\") pod \"keystone-5d21-account-create-update-4j4wt\" (UID: \"c2269b3b-bbaf-44bc-a77c-059195d12b86\") " pod="openstack/keystone-5d21-account-create-update-4j4wt" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.480362 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpbnv\" (UniqueName: \"kubernetes.io/projected/fdbe96df-78a6-449e-affb-3529fdc05d49-kube-api-access-zpbnv\") pod \"placement-db-create-8bftw\" (UID: \"fdbe96df-78a6-449e-affb-3529fdc05d49\") " pod="openstack/placement-db-create-8bftw" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.506957 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5d21-account-create-update-4j4wt" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.552867 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05e337d3-8903-40f9-84d2-bb1e0e7d4629-operator-scripts\") pod \"placement-c72d-account-create-update-hrdcp\" (UID: \"05e337d3-8903-40f9-84d2-bb1e0e7d4629\") " pod="openstack/placement-c72d-account-create-update-hrdcp" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.553077 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r7dr\" (UniqueName: \"kubernetes.io/projected/05e337d3-8903-40f9-84d2-bb1e0e7d4629-kube-api-access-6r7dr\") pod \"placement-c72d-account-create-update-hrdcp\" (UID: \"05e337d3-8903-40f9-84d2-bb1e0e7d4629\") " pod="openstack/placement-c72d-account-create-update-hrdcp" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.554207 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05e337d3-8903-40f9-84d2-bb1e0e7d4629-operator-scripts\") pod \"placement-c72d-account-create-update-hrdcp\" (UID: \"05e337d3-8903-40f9-84d2-bb1e0e7d4629\") " pod="openstack/placement-c72d-account-create-update-hrdcp" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.572449 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r7dr\" (UniqueName: \"kubernetes.io/projected/05e337d3-8903-40f9-84d2-bb1e0e7d4629-kube-api-access-6r7dr\") pod \"placement-c72d-account-create-update-hrdcp\" (UID: \"05e337d3-8903-40f9-84d2-bb1e0e7d4629\") " pod="openstack/placement-c72d-account-create-update-hrdcp" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.629784 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8bftw" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.643477 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-sgdhb" podUID="c0b0baed-3140-4ac4-9d27-e8fc15c390c2" containerName="ovn-controller" probeResult="failure" output=< Feb 25 16:11:07 crc kubenswrapper[4937]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 25 16:11:07 crc kubenswrapper[4937]: > Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.713090 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c72d-account-create-update-hrdcp" Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.907231 4937 generic.go:334] "Generic (PLEG): container finished" podID="f87815ec-3a3b-4521-97ef-abf88231d48b" containerID="d7ccbc37d2e9e7845e22afd673f137c9bda281f4362f748a265d8c6f692ae8c8" exitCode=0 Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.907304 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lfjf9" event={"ID":"f87815ec-3a3b-4521-97ef-abf88231d48b","Type":"ContainerDied","Data":"d7ccbc37d2e9e7845e22afd673f137c9bda281f4362f748a265d8c6f692ae8c8"} Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.908893 4937 generic.go:334] "Generic (PLEG): container finished" podID="2530391c-1cbc-4c0c-ab27-bba9cfcc5149" containerID="171df702add08fe3c36371361badbad1544eaf0ec157059c87ceb8ac24fc2729" exitCode=0 Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.908955 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1246-account-create-update-c8jvn" event={"ID":"2530391c-1cbc-4c0c-ab27-bba9cfcc5149","Type":"ContainerDied","Data":"171df702add08fe3c36371361badbad1544eaf0ec157059c87ceb8ac24fc2729"} Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.909980 4937 generic.go:334] "Generic (PLEG): container finished" podID="c163075d-e1f7-4252-92aa-17b9bbfe336a" containerID="5a4ecea5e96581497771457c324bd235ddd5897d43e16f46c92a3ed611831fdd" exitCode=0 Feb 25 16:11:07 crc kubenswrapper[4937]: I0225 16:11:07.910395 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zbnk5" event={"ID":"c163075d-e1f7-4252-92aa-17b9bbfe336a","Type":"ContainerDied","Data":"5a4ecea5e96581497771457c324bd235ddd5897d43e16f46c92a3ed611831fdd"} Feb 25 16:11:07 crc kubenswrapper[4937]: E0225 16:11:07.911973 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified\\\"\"" pod="openstack/ovn-northd-0" podUID="8ce79682-d3ee-4afb-ba50-fdacc0fe6910" Feb 25 16:11:08 crc kubenswrapper[4937]: W0225 16:11:08.002180 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc04fbb9c_6249_4f8a_b67b_4bcb9418bfd3.slice/crio-67d409c4dc67f2dc9e156a7bf0c8b07e68a17358404a75ea6b40aec3ce0b3672 WatchSource:0}: Error finding container 67d409c4dc67f2dc9e156a7bf0c8b07e68a17358404a75ea6b40aec3ce0b3672: Status 404 returned error can't find the container with id 67d409c4dc67f2dc9e156a7bf0c8b07e68a17358404a75ea6b40aec3ce0b3672 Feb 25 16:11:08 crc kubenswrapper[4937]: I0225 16:11:08.008957 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-g8745"] Feb 25 16:11:08 crc kubenswrapper[4937]: I0225 16:11:08.116354 4937 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8] : Timed out while waiting for systemd to remove kubepods-besteffort-pod005a2e70_fc68_4d15_b3c6_61dcd4f4c1a8.slice" Feb 25 16:11:08 crc kubenswrapper[4937]: E0225 16:11:08.116420 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8] : unable to destroy 
cgroup paths for cgroup [kubepods besteffort pod005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8] : Timed out while waiting for systemd to remove kubepods-besteffort-pod005a2e70_fc68_4d15_b3c6_61dcd4f4c1a8.slice" pod="openshift-infra/auto-csr-approver-29533930-xjhtn" podUID="005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8" Feb 25 16:11:08 crc kubenswrapper[4937]: I0225 16:11:08.136221 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d21-account-create-update-4j4wt"] Feb 25 16:11:08 crc kubenswrapper[4937]: W0225 16:11:08.141756 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdbe96df_78a6_449e_affb_3529fdc05d49.slice/crio-b0adf188ee475b5ae8abe6c3f72db35a9397c6f6fe78bbdc0dffd26a7cf840c4 WatchSource:0}: Error finding container b0adf188ee475b5ae8abe6c3f72db35a9397c6f6fe78bbdc0dffd26a7cf840c4: Status 404 returned error can't find the container with id b0adf188ee475b5ae8abe6c3f72db35a9397c6f6fe78bbdc0dffd26a7cf840c4 Feb 25 16:11:08 crc kubenswrapper[4937]: I0225 16:11:08.145115 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8bftw"] Feb 25 16:11:08 crc kubenswrapper[4937]: I0225 16:11:08.375917 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c72d-account-create-update-hrdcp"] Feb 25 16:11:08 crc kubenswrapper[4937]: W0225 16:11:08.821651 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05e337d3_8903_40f9_84d2_bb1e0e7d4629.slice/crio-0db539ddfb044ec5cf0977adb1a08d8ce5036608fbfbe2943d0bc5f05226a2d2 WatchSource:0}: Error finding container 0db539ddfb044ec5cf0977adb1a08d8ce5036608fbfbe2943d0bc5f05226a2d2: Status 404 returned error can't find the container with id 0db539ddfb044ec5cf0977adb1a08d8ce5036608fbfbe2943d0bc5f05226a2d2 Feb 25 16:11:08 crc kubenswrapper[4937]: I0225 16:11:08.921170 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8bftw" event={"ID":"fdbe96df-78a6-449e-affb-3529fdc05d49","Type":"ContainerStarted","Data":"b0adf188ee475b5ae8abe6c3f72db35a9397c6f6fe78bbdc0dffd26a7cf840c4"} Feb 25 16:11:08 crc kubenswrapper[4937]: I0225 16:11:08.924823 4937 generic.go:334] "Generic (PLEG): container finished" podID="c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3" containerID="29f9cdb7519e86fe9ba15e10efbd9aeddfd04710c8a4ea64818905fd87e80ea2" exitCode=0 Feb 25 16:11:08 crc kubenswrapper[4937]: I0225 16:11:08.924934 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g8745" event={"ID":"c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3","Type":"ContainerDied","Data":"29f9cdb7519e86fe9ba15e10efbd9aeddfd04710c8a4ea64818905fd87e80ea2"} Feb 25 16:11:08 crc kubenswrapper[4937]: I0225 16:11:08.924974 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g8745" event={"ID":"c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3","Type":"ContainerStarted","Data":"67d409c4dc67f2dc9e156a7bf0c8b07e68a17358404a75ea6b40aec3ce0b3672"} Feb 25 16:11:08 crc kubenswrapper[4937]: I0225 16:11:08.929011 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d21-account-create-update-4j4wt" event={"ID":"c2269b3b-bbaf-44bc-a77c-059195d12b86","Type":"ContainerStarted","Data":"3a3667f118fdd8700ad7e121855df2bc6d03cfbc36a9ad7d119ec76df344f903"} Feb 25 16:11:08 crc kubenswrapper[4937]: I0225 16:11:08.930962 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533930-xjhtn" Feb 25 16:11:08 crc kubenswrapper[4937]: I0225 16:11:08.931015 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c72d-account-create-update-hrdcp" event={"ID":"05e337d3-8903-40f9-84d2-bb1e0e7d4629","Type":"ContainerStarted","Data":"0db539ddfb044ec5cf0977adb1a08d8ce5036608fbfbe2943d0bc5f05226a2d2"} Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.427015 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1246-account-create-update-c8jvn" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.473269 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lfjf9" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.495578 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdfcp\" (UniqueName: \"kubernetes.io/projected/f87815ec-3a3b-4521-97ef-abf88231d48b-kube-api-access-sdfcp\") pod \"f87815ec-3a3b-4521-97ef-abf88231d48b\" (UID: \"f87815ec-3a3b-4521-97ef-abf88231d48b\") " Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.495673 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f87815ec-3a3b-4521-97ef-abf88231d48b-operator-scripts\") pod \"f87815ec-3a3b-4521-97ef-abf88231d48b\" (UID: \"f87815ec-3a3b-4521-97ef-abf88231d48b\") " Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.495734 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txwlm\" (UniqueName: \"kubernetes.io/projected/2530391c-1cbc-4c0c-ab27-bba9cfcc5149-kube-api-access-txwlm\") pod \"2530391c-1cbc-4c0c-ab27-bba9cfcc5149\" (UID: \"2530391c-1cbc-4c0c-ab27-bba9cfcc5149\") " Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.495939 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2530391c-1cbc-4c0c-ab27-bba9cfcc5149-operator-scripts\") pod \"2530391c-1cbc-4c0c-ab27-bba9cfcc5149\" (UID: \"2530391c-1cbc-4c0c-ab27-bba9cfcc5149\") " Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.498696 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2530391c-1cbc-4c0c-ab27-bba9cfcc5149-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2530391c-1cbc-4c0c-ab27-bba9cfcc5149" (UID: "2530391c-1cbc-4c0c-ab27-bba9cfcc5149"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.501070 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87815ec-3a3b-4521-97ef-abf88231d48b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f87815ec-3a3b-4521-97ef-abf88231d48b" (UID: "f87815ec-3a3b-4521-97ef-abf88231d48b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.508430 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-zbnk5" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.511137 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2530391c-1cbc-4c0c-ab27-bba9cfcc5149-kube-api-access-txwlm" (OuterVolumeSpecName: "kube-api-access-txwlm") pod "2530391c-1cbc-4c0c-ab27-bba9cfcc5149" (UID: "2530391c-1cbc-4c0c-ab27-bba9cfcc5149"). InnerVolumeSpecName "kube-api-access-txwlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.511820 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f87815ec-3a3b-4521-97ef-abf88231d48b-kube-api-access-sdfcp" (OuterVolumeSpecName: "kube-api-access-sdfcp") pod "f87815ec-3a3b-4521-97ef-abf88231d48b" (UID: "f87815ec-3a3b-4521-97ef-abf88231d48b"). InnerVolumeSpecName "kube-api-access-sdfcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.597436 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c163075d-e1f7-4252-92aa-17b9bbfe336a-operator-scripts\") pod \"c163075d-e1f7-4252-92aa-17b9bbfe336a\" (UID: \"c163075d-e1f7-4252-92aa-17b9bbfe336a\") " Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.598343 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c163075d-e1f7-4252-92aa-17b9bbfe336a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c163075d-e1f7-4252-92aa-17b9bbfe336a" (UID: "c163075d-e1f7-4252-92aa-17b9bbfe336a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.598637 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-httb9\" (UniqueName: \"kubernetes.io/projected/c163075d-e1f7-4252-92aa-17b9bbfe336a-kube-api-access-httb9\") pod \"c163075d-e1f7-4252-92aa-17b9bbfe336a\" (UID: \"c163075d-e1f7-4252-92aa-17b9bbfe336a\") " Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.599135 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f87815ec-3a3b-4521-97ef-abf88231d48b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.599148 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txwlm\" (UniqueName: \"kubernetes.io/projected/2530391c-1cbc-4c0c-ab27-bba9cfcc5149-kube-api-access-txwlm\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.599158 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2530391c-1cbc-4c0c-ab27-bba9cfcc5149-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.599166 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c163075d-e1f7-4252-92aa-17b9bbfe336a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.599183 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdfcp\" (UniqueName: \"kubernetes.io/projected/f87815ec-3a3b-4521-97ef-abf88231d48b-kube-api-access-sdfcp\") on node \"crc\" DevicePath \"\"" Feb 25 
16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.603071 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c163075d-e1f7-4252-92aa-17b9bbfe336a-kube-api-access-httb9" (OuterVolumeSpecName: "kube-api-access-httb9") pod "c163075d-e1f7-4252-92aa-17b9bbfe336a" (UID: "c163075d-e1f7-4252-92aa-17b9bbfe336a"). InnerVolumeSpecName "kube-api-access-httb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.700975 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-httb9\" (UniqueName: \"kubernetes.io/projected/c163075d-e1f7-4252-92aa-17b9bbfe336a-kube-api-access-httb9\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.942362 4937 generic.go:334] "Generic (PLEG): container finished" podID="c2269b3b-bbaf-44bc-a77c-059195d12b86" containerID="4d42b95fc6b1e699d92332668e701f0555d27ce7f429fabb5e19e2bedb7254fc" exitCode=0 Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.942414 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d21-account-create-update-4j4wt" event={"ID":"c2269b3b-bbaf-44bc-a77c-059195d12b86","Type":"ContainerDied","Data":"4d42b95fc6b1e699d92332668e701f0555d27ce7f429fabb5e19e2bedb7254fc"} Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.946829 4937 generic.go:334] "Generic (PLEG): container finished" podID="05e337d3-8903-40f9-84d2-bb1e0e7d4629" containerID="19c82d8b77bd9eee7e468b03627e81ea09173f9635cd3b5c970210c79edd727d" exitCode=0 Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.946911 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c72d-account-create-update-hrdcp" event={"ID":"05e337d3-8903-40f9-84d2-bb1e0e7d4629","Type":"ContainerDied","Data":"19c82d8b77bd9eee7e468b03627e81ea09173f9635cd3b5c970210c79edd727d"} Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.948934 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lfjf9" event={"ID":"f87815ec-3a3b-4521-97ef-abf88231d48b","Type":"ContainerDied","Data":"af1d88d3dbe35d11921d4585943f4b9ad1a513aada937bd07d1b2c812df57b42"} Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.948965 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af1d88d3dbe35d11921d4585943f4b9ad1a513aada937bd07d1b2c812df57b42" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.948940 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lfjf9" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.951625 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d773f4d2-bec3-4379-a7a2-29975a18c85b","Type":"ContainerStarted","Data":"8d7e53a6a97258904668c130e0d7a9926328a5d2358904d947104d98af307e5a"} Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.952830 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.955461 4937 generic.go:334] "Generic (PLEG): container finished" podID="b9ebad40-444e-4250-85cb-2a154282cdf9" containerID="7d15cf71941dacdd51d4d3f984cb980362aba44baf3e3b14e00f057c6dd681fc" exitCode=0 Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.955512 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9ebad40-444e-4250-85cb-2a154282cdf9","Type":"ContainerDied","Data":"7d15cf71941dacdd51d4d3f984cb980362aba44baf3e3b14e00f057c6dd681fc"} Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.955798 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.963628 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zbnk5" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.963625 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zbnk5" event={"ID":"c163075d-e1f7-4252-92aa-17b9bbfe336a","Type":"ContainerDied","Data":"2770a7df323fd0f135ce59045b58fd507fea558be50afe2341a3ca0bc96676eb"} Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.963747 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2770a7df323fd0f135ce59045b58fd507fea558be50afe2341a3ca0bc96676eb" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.965543 4937 generic.go:334] "Generic (PLEG): container finished" podID="fdbe96df-78a6-449e-affb-3529fdc05d49" containerID="237083e3c1d59c300baa7e892d31a3d9cfcae08915b0c62639d276547f4ca9e0" exitCode=0 Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.965606 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8bftw" event={"ID":"fdbe96df-78a6-449e-affb-3529fdc05d49","Type":"ContainerDied","Data":"237083e3c1d59c300baa7e892d31a3d9cfcae08915b0c62639d276547f4ca9e0"} Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.967355 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1246-account-create-update-c8jvn" event={"ID":"2530391c-1cbc-4c0c-ab27-bba9cfcc5149","Type":"ContainerDied","Data":"bf86df82772e405cf0e6a2de76ba0b7f401e50abb7cc408cf60e063b5e84dec4"} Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.967385 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf86df82772e405cf0e6a2de76ba0b7f401e50abb7cc408cf60e063b5e84dec4" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.967426 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1246-account-create-update-c8jvn" Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.969041 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e19ac505-41d9-4d1d-b75a-0c88e26960c8","Type":"ContainerStarted","Data":"0f02def2b2987794ca53f24f3e1c837b05c0fb749e7d29a9c4bbf2f2caeb3500"} Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.980017 4937 generic.go:334] "Generic (PLEG): container finished" podID="de5b4144-33d4-4860-9872-8826c78490a7" containerID="82dee4b670df39dc191f5c519f9747dc8a2893b9682ea4e53c75a02199de7f0c" exitCode=0 Feb 25 16:11:09 crc kubenswrapper[4937]: I0225 16:11:09.980738 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"de5b4144-33d4-4860-9872-8826c78490a7","Type":"ContainerDied","Data":"82dee4b670df39dc191f5c519f9747dc8a2893b9682ea4e53c75a02199de7f0c"} Feb 25 16:11:10 crc kubenswrapper[4937]: I0225 16:11:10.072092 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=29.429512633 podStartE2EDuration="1m12.072074369s" podCreationTimestamp="2026-02-25 16:09:58 +0000 UTC" firstStartedPulling="2026-02-25 16:10:23.572474663 +0000 UTC m=+1474.585866553" lastFinishedPulling="2026-02-25 16:11:06.215036399 +0000 UTC m=+1517.228428289" observedRunningTime="2026-02-25 16:11:10.016738252 +0000 UTC m=+1521.030130142" watchObservedRunningTime="2026-02-25 16:11:10.072074369 +0000 UTC m=+1521.085466249" Feb 25 16:11:10 crc kubenswrapper[4937]: I0225 16:11:10.382631 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g8745" Feb 25 16:11:10 crc kubenswrapper[4937]: I0225 16:11:10.528287 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3-operator-scripts\") pod \"c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3\" (UID: \"c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3\") " Feb 25 16:11:10 crc kubenswrapper[4937]: I0225 16:11:10.528736 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgtgp\" (UniqueName: \"kubernetes.io/projected/c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3-kube-api-access-zgtgp\") pod \"c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3\" (UID: \"c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3\") " Feb 25 16:11:10 crc kubenswrapper[4937]: I0225 16:11:10.528778 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3" (UID: "c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:10 crc kubenswrapper[4937]: I0225 16:11:10.529471 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:10 crc kubenswrapper[4937]: I0225 16:11:10.533814 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3-kube-api-access-zgtgp" (OuterVolumeSpecName: "kube-api-access-zgtgp") pod "c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3" (UID: "c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3"). InnerVolumeSpecName "kube-api-access-zgtgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:10 crc kubenswrapper[4937]: I0225 16:11:10.631951 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgtgp\" (UniqueName: \"kubernetes.io/projected/c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3-kube-api-access-zgtgp\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:10 crc kubenswrapper[4937]: I0225 16:11:10.996798 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"de5b4144-33d4-4860-9872-8826c78490a7","Type":"ContainerStarted","Data":"62a80333ddfc88f47488d577f59b7296f624b0810fae1400d4e957b9531f0159"} Feb 25 16:11:10 crc kubenswrapper[4937]: I0225 16:11:10.997138 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:11:10 crc kubenswrapper[4937]: I0225 16:11:10.999847 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6nkbm" event={"ID":"0a0f0530-95e1-4231-9933-bedb49b72a88","Type":"ContainerStarted","Data":"563c2ee421211750456aca87456cfdbdbbad8624f68731303309a98c77edb56f"} Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.003343 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9ebad40-444e-4250-85cb-2a154282cdf9","Type":"ContainerStarted","Data":"3501318ab23e809f44e4fe03fbb027a573e57d253846d0261206afe1f5c473ba"} Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.003693 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.006438 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g8745" event={"ID":"c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3","Type":"ContainerDied","Data":"67d409c4dc67f2dc9e156a7bf0c8b07e68a17358404a75ea6b40aec3ce0b3672"} Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.006587 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67d409c4dc67f2dc9e156a7bf0c8b07e68a17358404a75ea6b40aec3ce0b3672" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.006646 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-g8745" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.014442 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="08382e6d-e8e5-4656-a524-26c8269114fd" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.055339 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371956.799456 podStartE2EDuration="1m20.055319295s" podCreationTimestamp="2026-02-25 16:09:51 +0000 UTC" firstStartedPulling="2026-02-25 16:09:53.857990632 +0000 UTC m=+1444.871382522" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:11.039869938 +0000 UTC m=+1522.053261818" watchObservedRunningTime="2026-02-25 16:11:11.055319295 +0000 UTC m=+1522.068711175" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.095294 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.974836186 podStartE2EDuration="1m20.095274257s" podCreationTimestamp="2026-02-25 16:09:51 +0000 UTC" firstStartedPulling="2026-02-25 16:09:53.714449544 +0000 UTC m=+1444.727841444" lastFinishedPulling="2026-02-25 16:10:32.834887625 +0000 UTC m=+1483.848279515" observedRunningTime="2026-02-25 16:11:11.085687526 +0000 UTC m=+1522.099079426" watchObservedRunningTime="2026-02-25 16:11:11.095274257 +0000 UTC m=+1522.108666167" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.106254 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-6nkbm" podStartSLOduration=2.810210879 podStartE2EDuration="32.106236041s" podCreationTimestamp="2026-02-25 16:10:39 +0000 UTC" firstStartedPulling="2026-02-25 16:10:40.770715243 +0000 UTC m=+1491.784107143" lastFinishedPulling="2026-02-25 16:11:10.066740415 +0000 UTC m=+1521.080132305" observedRunningTime="2026-02-25 16:11:11.099543584 +0000 UTC m=+1522.112935494" watchObservedRunningTime="2026-02-25 16:11:11.106236041 +0000 UTC m=+1522.119627931" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.250446 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:11:11 crc kubenswrapper[4937]: E0225 16:11:11.250676 4937 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 25 16:11:11 crc kubenswrapper[4937]: E0225 16:11:11.250711 4937 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 25 16:11:11 crc kubenswrapper[4937]: E0225 16:11:11.250780 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift podName:48d22af0-5579-46fb-889d-fd34e46d26e9 nodeName:}" failed. No retries permitted until 2026-02-25 16:11:43.250760304 +0000 UTC m=+1554.264152194 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift") pod "swift-storage-0" (UID: "48d22af0-5579-46fb-889d-fd34e46d26e9") : configmap "swift-ring-files" not found Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.503383 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8bftw" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.554166 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbe96df-78a6-449e-affb-3529fdc05d49-operator-scripts\") pod \"fdbe96df-78a6-449e-affb-3529fdc05d49\" (UID: \"fdbe96df-78a6-449e-affb-3529fdc05d49\") " Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.554402 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpbnv\" (UniqueName: \"kubernetes.io/projected/fdbe96df-78a6-449e-affb-3529fdc05d49-kube-api-access-zpbnv\") pod \"fdbe96df-78a6-449e-affb-3529fdc05d49\" (UID: \"fdbe96df-78a6-449e-affb-3529fdc05d49\") " Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.556439 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdbe96df-78a6-449e-affb-3529fdc05d49-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fdbe96df-78a6-449e-affb-3529fdc05d49" (UID: "fdbe96df-78a6-449e-affb-3529fdc05d49"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.571591 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdbe96df-78a6-449e-affb-3529fdc05d49-kube-api-access-zpbnv" (OuterVolumeSpecName: "kube-api-access-zpbnv") pod "fdbe96df-78a6-449e-affb-3529fdc05d49" (UID: "fdbe96df-78a6-449e-affb-3529fdc05d49"). InnerVolumeSpecName "kube-api-access-zpbnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.586741 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-w8hgr"] Feb 25 16:11:11 crc kubenswrapper[4937]: E0225 16:11:11.587147 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2530391c-1cbc-4c0c-ab27-bba9cfcc5149" containerName="mariadb-account-create-update" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.587164 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2530391c-1cbc-4c0c-ab27-bba9cfcc5149" containerName="mariadb-account-create-update" Feb 25 16:11:11 crc kubenswrapper[4937]: E0225 16:11:11.587176 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87815ec-3a3b-4521-97ef-abf88231d48b" containerName="mariadb-account-create-update" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.587183 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87815ec-3a3b-4521-97ef-abf88231d48b" containerName="mariadb-account-create-update" Feb 25 16:11:11 crc kubenswrapper[4937]: E0225 16:11:11.587195 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3" containerName="mariadb-database-create" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.587202 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3" containerName="mariadb-database-create" Feb 25 16:11:11 crc kubenswrapper[4937]: E0225 16:11:11.587233 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdbe96df-78a6-449e-affb-3529fdc05d49" containerName="mariadb-database-create" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.587238 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdbe96df-78a6-449e-affb-3529fdc05d49" containerName="mariadb-database-create" Feb 25 16:11:11 crc kubenswrapper[4937]: E0225 16:11:11.587250 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c163075d-e1f7-4252-92aa-17b9bbfe336a" containerName="mariadb-database-create" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.587256 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c163075d-e1f7-4252-92aa-17b9bbfe336a" containerName="mariadb-database-create" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.587438 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="c163075d-e1f7-4252-92aa-17b9bbfe336a" containerName="mariadb-database-create" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.587466 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3" containerName="mariadb-database-create" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.587496 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f87815ec-3a3b-4521-97ef-abf88231d48b" containerName="mariadb-account-create-update" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.587518 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="2530391c-1cbc-4c0c-ab27-bba9cfcc5149" containerName="mariadb-account-create-update" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.587533 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdbe96df-78a6-449e-affb-3529fdc05d49" containerName="mariadb-database-create" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.588129 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.591168 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.591318 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-86t29" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.611014 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w8hgr"] Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.656809 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-config-data\") pod \"glance-db-sync-w8hgr\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.656878 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll2cv\" (UniqueName: \"kubernetes.io/projected/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-kube-api-access-ll2cv\") pod \"glance-db-sync-w8hgr\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.656942 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-combined-ca-bundle\") pod \"glance-db-sync-w8hgr\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.656960 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-db-sync-config-data\") pod \"glance-db-sync-w8hgr\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.657067 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpbnv\" (UniqueName: \"kubernetes.io/projected/fdbe96df-78a6-449e-affb-3529fdc05d49-kube-api-access-zpbnv\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.657078 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbe96df-78a6-449e-affb-3529fdc05d49-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.660356 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d21-account-create-update-4j4wt" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.677137 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c72d-account-create-update-hrdcp" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.758434 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwc8s\" (UniqueName: \"kubernetes.io/projected/c2269b3b-bbaf-44bc-a77c-059195d12b86-kube-api-access-zwc8s\") pod \"c2269b3b-bbaf-44bc-a77c-059195d12b86\" (UID: \"c2269b3b-bbaf-44bc-a77c-059195d12b86\") " Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.758504 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05e337d3-8903-40f9-84d2-bb1e0e7d4629-operator-scripts\") pod \"05e337d3-8903-40f9-84d2-bb1e0e7d4629\" (UID: \"05e337d3-8903-40f9-84d2-bb1e0e7d4629\") " Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.758736 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r7dr\" (UniqueName: \"kubernetes.io/projected/05e337d3-8903-40f9-84d2-bb1e0e7d4629-kube-api-access-6r7dr\") pod \"05e337d3-8903-40f9-84d2-bb1e0e7d4629\" (UID: \"05e337d3-8903-40f9-84d2-bb1e0e7d4629\") " Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.758758 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2269b3b-bbaf-44bc-a77c-059195d12b86-operator-scripts\") pod \"c2269b3b-bbaf-44bc-a77c-059195d12b86\" (UID: \"c2269b3b-bbaf-44bc-a77c-059195d12b86\") " Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.759573 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2269b3b-bbaf-44bc-a77c-059195d12b86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2269b3b-bbaf-44bc-a77c-059195d12b86" (UID: "c2269b3b-bbaf-44bc-a77c-059195d12b86"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.759784 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-config-data\") pod \"glance-db-sync-w8hgr\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.759959 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll2cv\" (UniqueName: \"kubernetes.io/projected/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-kube-api-access-ll2cv\") pod \"glance-db-sync-w8hgr\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.760116 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-combined-ca-bundle\") pod \"glance-db-sync-w8hgr\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.760236 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-db-sync-config-data\") pod \"glance-db-sync-w8hgr\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.760357 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05e337d3-8903-40f9-84d2-bb1e0e7d4629-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05e337d3-8903-40f9-84d2-bb1e0e7d4629" (UID: "05e337d3-8903-40f9-84d2-bb1e0e7d4629"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.760680 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05e337d3-8903-40f9-84d2-bb1e0e7d4629-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.760697 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2269b3b-bbaf-44bc-a77c-059195d12b86-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.762626 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e337d3-8903-40f9-84d2-bb1e0e7d4629-kube-api-access-6r7dr" (OuterVolumeSpecName: "kube-api-access-6r7dr") pod "05e337d3-8903-40f9-84d2-bb1e0e7d4629" (UID: "05e337d3-8903-40f9-84d2-bb1e0e7d4629"). InnerVolumeSpecName "kube-api-access-6r7dr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.763350 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-db-sync-config-data\") pod \"glance-db-sync-w8hgr\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.763447 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-config-data\") pod \"glance-db-sync-w8hgr\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.765705 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-combined-ca-bundle\") pod \"glance-db-sync-w8hgr\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.779882 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll2cv\" (UniqueName: \"kubernetes.io/projected/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-kube-api-access-ll2cv\") pod \"glance-db-sync-w8hgr\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.782642 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2269b3b-bbaf-44bc-a77c-059195d12b86-kube-api-access-zwc8s" (OuterVolumeSpecName: "kube-api-access-zwc8s") pod "c2269b3b-bbaf-44bc-a77c-059195d12b86" (UID: "c2269b3b-bbaf-44bc-a77c-059195d12b86"). InnerVolumeSpecName "kube-api-access-zwc8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.862631 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r7dr\" (UniqueName: \"kubernetes.io/projected/05e337d3-8903-40f9-84d2-bb1e0e7d4629-kube-api-access-6r7dr\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.862677 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwc8s\" (UniqueName: \"kubernetes.io/projected/c2269b3b-bbaf-44bc-a77c-059195d12b86-kube-api-access-zwc8s\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:11 crc kubenswrapper[4937]: I0225 16:11:11.971137 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.024588 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8bftw" event={"ID":"fdbe96df-78a6-449e-affb-3529fdc05d49","Type":"ContainerDied","Data":"b0adf188ee475b5ae8abe6c3f72db35a9397c6f6fe78bbdc0dffd26a7cf840c4"} Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.024874 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0adf188ee475b5ae8abe6c3f72db35a9397c6f6fe78bbdc0dffd26a7cf840c4" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.024655 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8bftw" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.037988 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d21-account-create-update-4j4wt" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.038218 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d21-account-create-update-4j4wt" event={"ID":"c2269b3b-bbaf-44bc-a77c-059195d12b86","Type":"ContainerDied","Data":"3a3667f118fdd8700ad7e121855df2bc6d03cfbc36a9ad7d119ec76df344f903"} Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.038266 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a3667f118fdd8700ad7e121855df2bc6d03cfbc36a9ad7d119ec76df344f903" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.041016 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c72d-account-create-update-hrdcp" event={"ID":"05e337d3-8903-40f9-84d2-bb1e0e7d4629","Type":"ContainerDied","Data":"0db539ddfb044ec5cf0977adb1a08d8ce5036608fbfbe2943d0bc5f05226a2d2"} Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.041073 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0db539ddfb044ec5cf0977adb1a08d8ce5036608fbfbe2943d0bc5f05226a2d2" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.041130 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c72d-account-create-update-hrdcp" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.433728 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-sgdhb" podUID="c0b0baed-3140-4ac4-9d27-e8fc15c390c2" containerName="ovn-controller" probeResult="failure" output=< Feb 25 16:11:12 crc kubenswrapper[4937]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 25 16:11:12 crc kubenswrapper[4937]: > Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.529893 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.541767 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w8hgr"] Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.544116 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4rsxl" Feb 25 16:11:12 crc kubenswrapper[4937]: W0225 16:11:12.551473 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d70e0ad_27d8_4998_988f_1b5e2c7b8a7d.slice/crio-50315280550d3775f6167e28a01ad520521a45388978c5c68a446abbe8b4a515 WatchSource:0}: Error finding container 50315280550d3775f6167e28a01ad520521a45388978c5c68a446abbe8b4a515: Status 404 returned error can't find the container with id 50315280550d3775f6167e28a01ad520521a45388978c5c68a446abbe8b4a515 Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.771248 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sgdhb-config-w878x"] Feb 25 16:11:12 crc kubenswrapper[4937]: E0225 16:11:12.771703 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e337d3-8903-40f9-84d2-bb1e0e7d4629" containerName="mariadb-account-create-update" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.771725 4937 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="05e337d3-8903-40f9-84d2-bb1e0e7d4629" containerName="mariadb-account-create-update" Feb 25 16:11:12 crc kubenswrapper[4937]: E0225 16:11:12.771752 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2269b3b-bbaf-44bc-a77c-059195d12b86" containerName="mariadb-account-create-update" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.771761 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2269b3b-bbaf-44bc-a77c-059195d12b86" containerName="mariadb-account-create-update" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.771956 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2269b3b-bbaf-44bc-a77c-059195d12b86" containerName="mariadb-account-create-update" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.771978 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e337d3-8903-40f9-84d2-bb1e0e7d4629" containerName="mariadb-account-create-update" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.772633 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.775099 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.795323 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sgdhb-config-w878x"] Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.893219 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-scripts\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.893319 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-additional-scripts\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.893365 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-run\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.893476 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-run-ovn\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.893544 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-log-ovn\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 
16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.893580 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzddg\" (UniqueName: \"kubernetes.io/projected/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-kube-api-access-kzddg\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.994535 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-run-ovn\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.994579 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-log-ovn\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.994612 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzddg\" (UniqueName: \"kubernetes.io/projected/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-kube-api-access-kzddg\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.994650 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-scripts\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.994702 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-additional-scripts\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.994736 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-run\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.995019 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-run\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.995024 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-run-ovn\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " 
pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.995195 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-log-ovn\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.995604 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-additional-scripts\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:12 crc kubenswrapper[4937]: I0225 16:11:12.996970 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-scripts\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:13 crc kubenswrapper[4937]: I0225 16:11:13.017618 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzddg\" (UniqueName: \"kubernetes.io/projected/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-kube-api-access-kzddg\") pod \"ovn-controller-sgdhb-config-w878x\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:13 crc kubenswrapper[4937]: I0225 16:11:13.048074 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w8hgr" event={"ID":"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d","Type":"ContainerStarted","Data":"50315280550d3775f6167e28a01ad520521a45388978c5c68a446abbe8b4a515"} Feb 25 16:11:13 crc kubenswrapper[4937]: I0225 16:11:13.092348 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:13 crc kubenswrapper[4937]: I0225 16:11:13.623505 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sgdhb-config-w878x"] Feb 25 16:11:14 crc kubenswrapper[4937]: I0225 16:11:14.057414 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sgdhb-config-w878x" event={"ID":"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce","Type":"ContainerStarted","Data":"4757fbd259c99a2f9f1412b84438fd9fade0f8f5a039a702d6f0c5dac11e2cc9"} Feb 25 16:11:14 crc kubenswrapper[4937]: I0225 16:11:14.057910 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sgdhb-config-w878x" event={"ID":"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce","Type":"ContainerStarted","Data":"23c53e182513a059f3236db200efe46d7f2b7a4b6cfedbd39ea9d7e8343fc5af"} Feb 25 16:11:14 crc kubenswrapper[4937]: I0225 16:11:14.074265 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sgdhb-config-w878x" podStartSLOduration=2.074244597 podStartE2EDuration="2.074244597s" podCreationTimestamp="2026-02-25 16:11:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:14.071064348 +0000 UTC m=+1525.084456238" watchObservedRunningTime="2026-02-25 16:11:14.074244597 +0000 UTC m=+1525.087636487" Feb 25 16:11:14 crc kubenswrapper[4937]: I0225 16:11:14.736344 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-lfjf9"] Feb 25 16:11:14 crc kubenswrapper[4937]: I0225 16:11:14.755100 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-lfjf9"] Feb 25 16:11:15 crc kubenswrapper[4937]: I0225 16:11:15.067348 4937 generic.go:334] "Generic (PLEG): container finished" podID="250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce" containerID="4757fbd259c99a2f9f1412b84438fd9fade0f8f5a039a702d6f0c5dac11e2cc9" exitCode=0 Feb 25 16:11:15 crc kubenswrapper[4937]: I0225 16:11:15.067399 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sgdhb-config-w878x" event={"ID":"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce","Type":"ContainerDied","Data":"4757fbd259c99a2f9f1412b84438fd9fade0f8f5a039a702d6f0c5dac11e2cc9"} Feb 25 16:11:15 crc kubenswrapper[4937]: I0225 16:11:15.380134 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f87815ec-3a3b-4521-97ef-abf88231d48b" path="/var/lib/kubelet/pods/f87815ec-3a3b-4521-97ef-abf88231d48b/volumes" Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.076919 4937 generic.go:334] "Generic (PLEG): container finished" podID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerID="0f02def2b2987794ca53f24f3e1c837b05c0fb749e7d29a9c4bbf2f2caeb3500" exitCode=0 Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.077003 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e19ac505-41d9-4d1d-b75a-0c88e26960c8","Type":"ContainerDied","Data":"0f02def2b2987794ca53f24f3e1c837b05c0fb749e7d29a9c4bbf2f2caeb3500"} Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.415192 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.454286 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-additional-scripts\") pod \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.454702 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce" (UID: "250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.454772 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-run\") pod \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.454835 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-run" (OuterVolumeSpecName: "var-run") pod "250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce" (UID: "250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.455020 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-scripts\") pod \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.456308 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-scripts" (OuterVolumeSpecName: "scripts") pod "250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce" (UID: "250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.456371 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-run-ovn\") pod \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.456407 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-log-ovn\") pod \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.456438 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce" (UID: "250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.456453 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce" (UID: "250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.456532 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzddg\" (UniqueName: \"kubernetes.io/projected/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-kube-api-access-kzddg\") pod \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\" (UID: \"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce\") " Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.458505 4937 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.458564 4937 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-run\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.458676 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.458687 4937 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.458736 4937 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.464395 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-kube-api-access-kzddg" (OuterVolumeSpecName: "kube-api-access-kzddg") pod "250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce" (UID: "250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce"). InnerVolumeSpecName "kube-api-access-kzddg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:16 crc kubenswrapper[4937]: I0225 16:11:16.561393 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzddg\" (UniqueName: \"kubernetes.io/projected/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce-kube-api-access-kzddg\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.095538 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sgdhb-config-w878x" event={"ID":"250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce","Type":"ContainerDied","Data":"23c53e182513a059f3236db200efe46d7f2b7a4b6cfedbd39ea9d7e8343fc5af"} Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.095575 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sgdhb-config-w878x" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.095604 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23c53e182513a059f3236db200efe46d7f2b7a4b6cfedbd39ea9d7e8343fc5af" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.165379 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-sgdhb-config-w878x"] Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.181796 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-sgdhb-config-w878x"] Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.383722 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce" path="/var/lib/kubelet/pods/250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce/volumes" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.390379 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sgdhb-config-5pqxn"] Feb 25 16:11:17 crc kubenswrapper[4937]: E0225 16:11:17.390797 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce" containerName="ovn-config" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.390817 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce" containerName="ovn-config" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.390991 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="250b07b0-67c7-4a5d-8cc0-c129a3f4a2ce" containerName="ovn-config" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.391602 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.394470 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.409445 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sgdhb-config-5pqxn"] Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.446792 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-sgdhb" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.490125 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-run-ovn\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.490565 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s24q2\" (UniqueName: \"kubernetes.io/projected/79be1a43-3786-4cf6-8127-016cb865312c-kube-api-access-s24q2\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.490669 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79be1a43-3786-4cf6-8127-016cb865312c-additional-scripts\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: 
\"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.490705 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-log-ovn\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.490720 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79be1a43-3786-4cf6-8127-016cb865312c-scripts\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.490874 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-run\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.592107 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-run-ovn\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.592183 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s24q2\" (UniqueName: \"kubernetes.io/projected/79be1a43-3786-4cf6-8127-016cb865312c-kube-api-access-s24q2\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.592237 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79be1a43-3786-4cf6-8127-016cb865312c-additional-scripts\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.592265 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-log-ovn\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.592280 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79be1a43-3786-4cf6-8127-016cb865312c-scripts\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.592346 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-run\") pod 
\"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.592446 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-run-ovn\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.592452 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-run\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.593267 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79be1a43-3786-4cf6-8127-016cb865312c-additional-scripts\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.594258 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79be1a43-3786-4cf6-8127-016cb865312c-scripts\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.594365 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-log-ovn\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.609888 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s24q2\" (UniqueName: \"kubernetes.io/projected/79be1a43-3786-4cf6-8127-016cb865312c-kube-api-access-s24q2\") pod \"ovn-controller-sgdhb-config-5pqxn\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:17 crc kubenswrapper[4937]: I0225 16:11:17.708812 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:18 crc kubenswrapper[4937]: I0225 16:11:18.192392 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sgdhb-config-5pqxn"] Feb 25 16:11:18 crc kubenswrapper[4937]: W0225 16:11:18.209637 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79be1a43_3786_4cf6_8127_016cb865312c.slice/crio-7bab99daa9b795d3a3cf832863dace83e6cfe05a6e0bf280fd482cb8c8f6de72 WatchSource:0}: Error finding container 7bab99daa9b795d3a3cf832863dace83e6cfe05a6e0bf280fd482cb8c8f6de72: Status 404 returned error can't find the container with id 7bab99daa9b795d3a3cf832863dace83e6cfe05a6e0bf280fd482cb8c8f6de72 Feb 25 16:11:19 crc kubenswrapper[4937]: I0225 16:11:19.121906 4937 generic.go:334] "Generic (PLEG): container finished" podID="79be1a43-3786-4cf6-8127-016cb865312c" containerID="0245edefc1fec8749d4735f0ca12e19c045029c5a10ad5e50e21bb8a1e76d09d" exitCode=0 Feb 25 16:11:19 crc kubenswrapper[4937]: I0225 16:11:19.122256 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sgdhb-config-5pqxn" event={"ID":"79be1a43-3786-4cf6-8127-016cb865312c","Type":"ContainerDied","Data":"0245edefc1fec8749d4735f0ca12e19c045029c5a10ad5e50e21bb8a1e76d09d"} Feb 25 16:11:19 crc kubenswrapper[4937]: I0225 16:11:19.122285 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sgdhb-config-5pqxn" event={"ID":"79be1a43-3786-4cf6-8127-016cb865312c","Type":"ContainerStarted","Data":"7bab99daa9b795d3a3cf832863dace83e6cfe05a6e0bf280fd482cb8c8f6de72"} Feb 25 16:11:19 crc kubenswrapper[4937]: I0225 16:11:19.124971 4937 generic.go:334] "Generic (PLEG): container finished" podID="0a0f0530-95e1-4231-9933-bedb49b72a88" containerID="563c2ee421211750456aca87456cfdbdbbad8624f68731303309a98c77edb56f" exitCode=0 Feb 25 16:11:19 crc kubenswrapper[4937]: I0225 16:11:19.125050 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6nkbm" event={"ID":"0a0f0530-95e1-4231-9933-bedb49b72a88","Type":"ContainerDied","Data":"563c2ee421211750456aca87456cfdbdbbad8624f68731303309a98c77edb56f"} Feb 25 16:11:19 crc kubenswrapper[4937]: I0225 16:11:19.836165 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gbxlp"] Feb 25 16:11:19 crc kubenswrapper[4937]: I0225 16:11:19.838171 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gbxlp" Feb 25 16:11:19 crc kubenswrapper[4937]: I0225 16:11:19.841235 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 25 16:11:19 crc kubenswrapper[4937]: I0225 16:11:19.845008 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gbxlp"] Feb 25 16:11:19 crc kubenswrapper[4937]: I0225 16:11:19.938814 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-682k8\" (UniqueName: \"kubernetes.io/projected/231f1566-c91c-47f6-9ef5-5a9fbc5b0c57-kube-api-access-682k8\") pod \"root-account-create-update-gbxlp\" (UID: \"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57\") " pod="openstack/root-account-create-update-gbxlp" Feb 25 16:11:19 crc kubenswrapper[4937]: I0225 16:11:19.939236 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231f1566-c91c-47f6-9ef5-5a9fbc5b0c57-operator-scripts\") pod \"root-account-create-update-gbxlp\" (UID: \"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57\") " pod="openstack/root-account-create-update-gbxlp" Feb 25 16:11:20 crc kubenswrapper[4937]: I0225 16:11:20.041448 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-682k8\" (UniqueName: \"kubernetes.io/projected/231f1566-c91c-47f6-9ef5-5a9fbc5b0c57-kube-api-access-682k8\") pod \"root-account-create-update-gbxlp\" (UID: \"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57\") " pod="openstack/root-account-create-update-gbxlp" Feb 25 16:11:20 crc kubenswrapper[4937]: I0225 16:11:20.042027 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231f1566-c91c-47f6-9ef5-5a9fbc5b0c57-operator-scripts\") pod \"root-account-create-update-gbxlp\" (UID: \"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57\") " pod="openstack/root-account-create-update-gbxlp" Feb 25 16:11:20 crc kubenswrapper[4937]: I0225 16:11:20.046207 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231f1566-c91c-47f6-9ef5-5a9fbc5b0c57-operator-scripts\") pod \"root-account-create-update-gbxlp\" (UID: \"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57\") " pod="openstack/root-account-create-update-gbxlp" Feb 25 16:11:20 crc kubenswrapper[4937]: I0225 16:11:20.072078 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-682k8\" (UniqueName: \"kubernetes.io/projected/231f1566-c91c-47f6-9ef5-5a9fbc5b0c57-kube-api-access-682k8\") pod \"root-account-create-update-gbxlp\" (UID: \"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57\") " pod="openstack/root-account-create-update-gbxlp" Feb 25 16:11:20 crc kubenswrapper[4937]: I0225 16:11:20.162177 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gbxlp" Feb 25 16:11:21 crc kubenswrapper[4937]: I0225 16:11:21.014599 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="08382e6d-e8e5-4656-a524-26c8269114fd" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 25 16:11:22 crc kubenswrapper[4937]: I0225 16:11:22.992371 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.316654 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.348934 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-9cgvr"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.350399 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-9cgvr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.384620 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-9cgvr"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.468954 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-zg54r"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.470552 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zg54r" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.495660 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zg54r"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.545790 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psqgd\" (UniqueName: \"kubernetes.io/projected/c1373796-a25a-406b-a417-a26ff42bbce4-kube-api-access-psqgd\") pod \"cloudkitty-db-create-9cgvr\" (UID: \"c1373796-a25a-406b-a417-a26ff42bbce4\") " pod="openstack/cloudkitty-db-create-9cgvr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.545851 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1373796-a25a-406b-a417-a26ff42bbce4-operator-scripts\") pod \"cloudkitty-db-create-9cgvr\" (UID: \"c1373796-a25a-406b-a417-a26ff42bbce4\") " pod="openstack/cloudkitty-db-create-9cgvr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.560652 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f8fc-account-create-update-drksr"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.562105 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f8fc-account-create-update-drksr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.564160 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.569943 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f8fc-account-create-update-drksr"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.647521 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfvqx\" (UniqueName: \"kubernetes.io/projected/31d3da10-44f5-48ce-8279-6565217f5ab2-kube-api-access-vfvqx\") pod \"cinder-db-create-zg54r\" (UID: \"31d3da10-44f5-48ce-8279-6565217f5ab2\") " pod="openstack/cinder-db-create-zg54r" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.647598 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psqgd\" (UniqueName: \"kubernetes.io/projected/c1373796-a25a-406b-a417-a26ff42bbce4-kube-api-access-psqgd\") pod \"cloudkitty-db-create-9cgvr\" (UID: \"c1373796-a25a-406b-a417-a26ff42bbce4\") " pod="openstack/cloudkitty-db-create-9cgvr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.647635 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1373796-a25a-406b-a417-a26ff42bbce4-operator-scripts\") pod \"cloudkitty-db-create-9cgvr\" (UID: \"c1373796-a25a-406b-a417-a26ff42bbce4\") " pod="openstack/cloudkitty-db-create-9cgvr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.647655 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31d3da10-44f5-48ce-8279-6565217f5ab2-operator-scripts\") pod \"cinder-db-create-zg54r\" (UID: \"31d3da10-44f5-48ce-8279-6565217f5ab2\") " pod="openstack/cinder-db-create-zg54r" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.648586 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1373796-a25a-406b-a417-a26ff42bbce4-operator-scripts\") pod \"cloudkitty-db-create-9cgvr\" (UID: \"c1373796-a25a-406b-a417-a26ff42bbce4\") " pod="openstack/cloudkitty-db-create-9cgvr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.668121 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psqgd\" (UniqueName: \"kubernetes.io/projected/c1373796-a25a-406b-a417-a26ff42bbce4-kube-api-access-psqgd\") pod \"cloudkitty-db-create-9cgvr\" (UID: \"c1373796-a25a-406b-a417-a26ff42bbce4\") " pod="openstack/cloudkitty-db-create-9cgvr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.683055 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-4nvj6"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.684606 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4nvj6" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.688072 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-9cgvr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.692241 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.692451 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d9p7q" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.692586 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.692645 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.700288 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4nvj6"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.750551 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca05068-9163-4e5f-abc8-c98462b3b6c8-operator-scripts\") pod \"cinder-f8fc-account-create-update-drksr\" (UID: \"cca05068-9163-4e5f-abc8-c98462b3b6c8\") " pod="openstack/cinder-f8fc-account-create-update-drksr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.750586 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frdmw\" (UniqueName: \"kubernetes.io/projected/cca05068-9163-4e5f-abc8-c98462b3b6c8-kube-api-access-frdmw\") pod \"cinder-f8fc-account-create-update-drksr\" (UID: \"cca05068-9163-4e5f-abc8-c98462b3b6c8\") " pod="openstack/cinder-f8fc-account-create-update-drksr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.750615 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfvqx\" (UniqueName: \"kubernetes.io/projected/31d3da10-44f5-48ce-8279-6565217f5ab2-kube-api-access-vfvqx\") pod \"cinder-db-create-zg54r\" (UID: \"31d3da10-44f5-48ce-8279-6565217f5ab2\") " pod="openstack/cinder-db-create-zg54r" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.750813 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31d3da10-44f5-48ce-8279-6565217f5ab2-operator-scripts\") pod \"cinder-db-create-zg54r\" (UID: \"31d3da10-44f5-48ce-8279-6565217f5ab2\") " pod="openstack/cinder-db-create-zg54r" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.751834 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31d3da10-44f5-48ce-8279-6565217f5ab2-operator-scripts\") pod \"cinder-db-create-zg54r\" (UID: \"31d3da10-44f5-48ce-8279-6565217f5ab2\") " pod="openstack/cinder-db-create-zg54r" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.760466 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-gfsht"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.767773 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-gfsht" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.782168 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfvqx\" (UniqueName: \"kubernetes.io/projected/31d3da10-44f5-48ce-8279-6565217f5ab2-kube-api-access-vfvqx\") pod \"cinder-db-create-zg54r\" (UID: \"31d3da10-44f5-48ce-8279-6565217f5ab2\") " pod="openstack/cinder-db-create-zg54r" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.799555 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gfsht"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.799994 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zg54r" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.845069 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-cj6fx"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.846152 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cj6fx" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.853909 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bv42\" (UniqueName: \"kubernetes.io/projected/88c34e87-5116-4387-9e2b-e5fbcedb6f55-kube-api-access-2bv42\") pod \"keystone-db-sync-4nvj6\" (UID: \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\") " pod="openstack/keystone-db-sync-4nvj6" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.854004 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c34e87-5116-4387-9e2b-e5fbcedb6f55-combined-ca-bundle\") pod \"keystone-db-sync-4nvj6\" (UID: \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\") " pod="openstack/keystone-db-sync-4nvj6" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.854068 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88c34e87-5116-4387-9e2b-e5fbcedb6f55-config-data\") pod \"keystone-db-sync-4nvj6\" (UID: \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\") " pod="openstack/keystone-db-sync-4nvj6" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.854160 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca05068-9163-4e5f-abc8-c98462b3b6c8-operator-scripts\") pod \"cinder-f8fc-account-create-update-drksr\" (UID: \"cca05068-9163-4e5f-abc8-c98462b3b6c8\") " pod="openstack/cinder-f8fc-account-create-update-drksr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.854184 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frdmw\" (UniqueName: \"kubernetes.io/projected/cca05068-9163-4e5f-abc8-c98462b3b6c8-kube-api-access-frdmw\") pod \"cinder-f8fc-account-create-update-drksr\" (UID: \"cca05068-9163-4e5f-abc8-c98462b3b6c8\") " pod="openstack/cinder-f8fc-account-create-update-drksr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.855568 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca05068-9163-4e5f-abc8-c98462b3b6c8-operator-scripts\") pod \"cinder-f8fc-account-create-update-drksr\" (UID: \"cca05068-9163-4e5f-abc8-c98462b3b6c8\") " 
pod="openstack/cinder-f8fc-account-create-update-drksr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.866572 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cj6fx"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.888009 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-a261-account-create-update-9j4jx"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.889308 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-a261-account-create-update-9j4jx" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.894762 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.913826 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frdmw\" (UniqueName: \"kubernetes.io/projected/cca05068-9163-4e5f-abc8-c98462b3b6c8-kube-api-access-frdmw\") pod \"cinder-f8fc-account-create-update-drksr\" (UID: \"cca05068-9163-4e5f-abc8-c98462b3b6c8\") " pod="openstack/cinder-f8fc-account-create-update-drksr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.929011 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f8fc-account-create-update-drksr" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.931558 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-a261-account-create-update-9j4jx"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.955368 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88c34e87-5116-4387-9e2b-e5fbcedb6f55-config-data\") pod \"keystone-db-sync-4nvj6\" (UID: \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\") " pod="openstack/keystone-db-sync-4nvj6" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.955444 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnjq6\" (UniqueName: \"kubernetes.io/projected/296328ce-10bb-42d4-a3e6-3b4986e9b944-kube-api-access-nnjq6\") pod \"neutron-db-create-gfsht\" (UID: \"296328ce-10bb-42d4-a3e6-3b4986e9b944\") " pod="openstack/neutron-db-create-gfsht" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.955476 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/296328ce-10bb-42d4-a3e6-3b4986e9b944-operator-scripts\") pod \"neutron-db-create-gfsht\" (UID: \"296328ce-10bb-42d4-a3e6-3b4986e9b944\") " pod="openstack/neutron-db-create-gfsht" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.955523 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzxp5\" (UniqueName: \"kubernetes.io/projected/2827afe8-442b-4aa9-95f6-48ef3c9a3995-kube-api-access-jzxp5\") pod \"barbican-db-create-cj6fx\" (UID: \"2827afe8-442b-4aa9-95f6-48ef3c9a3995\") " pod="openstack/barbican-db-create-cj6fx" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.955582 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2827afe8-442b-4aa9-95f6-48ef3c9a3995-operator-scripts\") pod \"barbican-db-create-cj6fx\" (UID: \"2827afe8-442b-4aa9-95f6-48ef3c9a3995\") " pod="openstack/barbican-db-create-cj6fx" Feb 25 
16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.955621 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bv42\" (UniqueName: \"kubernetes.io/projected/88c34e87-5116-4387-9e2b-e5fbcedb6f55-kube-api-access-2bv42\") pod \"keystone-db-sync-4nvj6\" (UID: \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\") " pod="openstack/keystone-db-sync-4nvj6" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.955671 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c34e87-5116-4387-9e2b-e5fbcedb6f55-combined-ca-bundle\") pod \"keystone-db-sync-4nvj6\" (UID: \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\") " pod="openstack/keystone-db-sync-4nvj6" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.959375 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c34e87-5116-4387-9e2b-e5fbcedb6f55-combined-ca-bundle\") pod \"keystone-db-sync-4nvj6\" (UID: \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\") " pod="openstack/keystone-db-sync-4nvj6" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.971419 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88c34e87-5116-4387-9e2b-e5fbcedb6f55-config-data\") pod \"keystone-db-sync-4nvj6\" (UID: \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\") " pod="openstack/keystone-db-sync-4nvj6" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.974205 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-10df-account-create-update-zbrbt"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.975567 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-10df-account-create-update-zbrbt" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.978802 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.987289 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-10df-account-create-update-zbrbt"] Feb 25 16:11:23 crc kubenswrapper[4937]: I0225 16:11:23.997895 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bv42\" (UniqueName: \"kubernetes.io/projected/88c34e87-5116-4387-9e2b-e5fbcedb6f55-kube-api-access-2bv42\") pod \"keystone-db-sync-4nvj6\" (UID: \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\") " pod="openstack/keystone-db-sync-4nvj6" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.055342 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4nvj6" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.057193 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fceee1fa-35a1-4b5d-aca8-054ab1816927-operator-scripts\") pod \"cloudkitty-a261-account-create-update-9j4jx\" (UID: \"fceee1fa-35a1-4b5d-aca8-054ab1816927\") " pod="openstack/cloudkitty-a261-account-create-update-9j4jx" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.057240 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnjq6\" (UniqueName: \"kubernetes.io/projected/296328ce-10bb-42d4-a3e6-3b4986e9b944-kube-api-access-nnjq6\") pod \"neutron-db-create-gfsht\" (UID: \"296328ce-10bb-42d4-a3e6-3b4986e9b944\") " pod="openstack/neutron-db-create-gfsht" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.057274 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/296328ce-10bb-42d4-a3e6-3b4986e9b944-operator-scripts\") pod \"neutron-db-create-gfsht\" (UID: \"296328ce-10bb-42d4-a3e6-3b4986e9b944\") " pod="openstack/neutron-db-create-gfsht" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.057300 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzxp5\" (UniqueName: \"kubernetes.io/projected/2827afe8-442b-4aa9-95f6-48ef3c9a3995-kube-api-access-jzxp5\") pod \"barbican-db-create-cj6fx\" (UID: \"2827afe8-442b-4aa9-95f6-48ef3c9a3995\") " pod="openstack/barbican-db-create-cj6fx" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.057343 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2827afe8-442b-4aa9-95f6-48ef3c9a3995-operator-scripts\") pod \"barbican-db-create-cj6fx\" (UID: \"2827afe8-442b-4aa9-95f6-48ef3c9a3995\") " pod="openstack/barbican-db-create-cj6fx" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.057431 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-746hx\" (UniqueName: \"kubernetes.io/projected/fceee1fa-35a1-4b5d-aca8-054ab1816927-kube-api-access-746hx\") pod \"cloudkitty-a261-account-create-update-9j4jx\" (UID: \"fceee1fa-35a1-4b5d-aca8-054ab1816927\") " pod="openstack/cloudkitty-a261-account-create-update-9j4jx" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.058273 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/296328ce-10bb-42d4-a3e6-3b4986e9b944-operator-scripts\") pod \"neutron-db-create-gfsht\" (UID: \"296328ce-10bb-42d4-a3e6-3b4986e9b944\") " pod="openstack/neutron-db-create-gfsht" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.061601 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1efe-account-create-update-wh56p"] Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.061907 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2827afe8-442b-4aa9-95f6-48ef3c9a3995-operator-scripts\") pod \"barbican-db-create-cj6fx\" (UID: \"2827afe8-442b-4aa9-95f6-48ef3c9a3995\") " pod="openstack/barbican-db-create-cj6fx" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.065866 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1efe-account-create-update-wh56p" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.069149 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.079477 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1efe-account-create-update-wh56p"] Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.093162 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnjq6\" (UniqueName: \"kubernetes.io/projected/296328ce-10bb-42d4-a3e6-3b4986e9b944-kube-api-access-nnjq6\") pod \"neutron-db-create-gfsht\" (UID: \"296328ce-10bb-42d4-a3e6-3b4986e9b944\") " pod="openstack/neutron-db-create-gfsht" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.093665 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzxp5\" (UniqueName: \"kubernetes.io/projected/2827afe8-442b-4aa9-95f6-48ef3c9a3995-kube-api-access-jzxp5\") pod \"barbican-db-create-cj6fx\" (UID: \"2827afe8-442b-4aa9-95f6-48ef3c9a3995\") " pod="openstack/barbican-db-create-cj6fx" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.134620 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gfsht" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.159538 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-746hx\" (UniqueName: \"kubernetes.io/projected/fceee1fa-35a1-4b5d-aca8-054ab1816927-kube-api-access-746hx\") pod \"cloudkitty-a261-account-create-update-9j4jx\" (UID: \"fceee1fa-35a1-4b5d-aca8-054ab1816927\") " pod="openstack/cloudkitty-a261-account-create-update-9j4jx" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.159614 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f262d7a-313f-4cdc-8a9a-5a5765fb3da0-operator-scripts\") pod \"barbican-10df-account-create-update-zbrbt\" (UID: \"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0\") " pod="openstack/barbican-10df-account-create-update-zbrbt" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.159652 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fceee1fa-35a1-4b5d-aca8-054ab1816927-operator-scripts\") pod \"cloudkitty-a261-account-create-update-9j4jx\" (UID: \"fceee1fa-35a1-4b5d-aca8-054ab1816927\") " pod="openstack/cloudkitty-a261-account-create-update-9j4jx" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.159785 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j4r2\" (UniqueName: \"kubernetes.io/projected/2f262d7a-313f-4cdc-8a9a-5a5765fb3da0-kube-api-access-7j4r2\") pod \"barbican-10df-account-create-update-zbrbt\" (UID: \"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0\") " pod="openstack/barbican-10df-account-create-update-zbrbt" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.160804 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fceee1fa-35a1-4b5d-aca8-054ab1816927-operator-scripts\") pod \"cloudkitty-a261-account-create-update-9j4jx\" (UID: \"fceee1fa-35a1-4b5d-aca8-054ab1816927\") " pod="openstack/cloudkitty-a261-account-create-update-9j4jx" Feb 25 16:11:24 crc 
kubenswrapper[4937]: I0225 16:11:24.170895 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cj6fx" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.182809 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-746hx\" (UniqueName: \"kubernetes.io/projected/fceee1fa-35a1-4b5d-aca8-054ab1816927-kube-api-access-746hx\") pod \"cloudkitty-a261-account-create-update-9j4jx\" (UID: \"fceee1fa-35a1-4b5d-aca8-054ab1816927\") " pod="openstack/cloudkitty-a261-account-create-update-9j4jx" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.245289 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-a261-account-create-update-9j4jx" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.261611 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c517a3-e534-4a32-aaa1-23ad2d42fc10-operator-scripts\") pod \"neutron-1efe-account-create-update-wh56p\" (UID: \"a1c517a3-e534-4a32-aaa1-23ad2d42fc10\") " pod="openstack/neutron-1efe-account-create-update-wh56p" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.261677 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbjz4\" (UniqueName: \"kubernetes.io/projected/a1c517a3-e534-4a32-aaa1-23ad2d42fc10-kube-api-access-mbjz4\") pod \"neutron-1efe-account-create-update-wh56p\" (UID: \"a1c517a3-e534-4a32-aaa1-23ad2d42fc10\") " pod="openstack/neutron-1efe-account-create-update-wh56p" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.261708 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j4r2\" (UniqueName: \"kubernetes.io/projected/2f262d7a-313f-4cdc-8a9a-5a5765fb3da0-kube-api-access-7j4r2\") pod \"barbican-10df-account-create-update-zbrbt\" (UID: \"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0\") " pod="openstack/barbican-10df-account-create-update-zbrbt" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.261773 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f262d7a-313f-4cdc-8a9a-5a5765fb3da0-operator-scripts\") pod \"barbican-10df-account-create-update-zbrbt\" (UID: \"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0\") " pod="openstack/barbican-10df-account-create-update-zbrbt" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.262422 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f262d7a-313f-4cdc-8a9a-5a5765fb3da0-operator-scripts\") pod \"barbican-10df-account-create-update-zbrbt\" (UID: \"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0\") " pod="openstack/barbican-10df-account-create-update-zbrbt" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.282277 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j4r2\" (UniqueName: \"kubernetes.io/projected/2f262d7a-313f-4cdc-8a9a-5a5765fb3da0-kube-api-access-7j4r2\") pod \"barbican-10df-account-create-update-zbrbt\" (UID: \"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0\") " pod="openstack/barbican-10df-account-create-update-zbrbt" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.354087 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-10df-account-create-update-zbrbt" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.363425 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbjz4\" (UniqueName: \"kubernetes.io/projected/a1c517a3-e534-4a32-aaa1-23ad2d42fc10-kube-api-access-mbjz4\") pod \"neutron-1efe-account-create-update-wh56p\" (UID: \"a1c517a3-e534-4a32-aaa1-23ad2d42fc10\") " pod="openstack/neutron-1efe-account-create-update-wh56p" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.363681 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c517a3-e534-4a32-aaa1-23ad2d42fc10-operator-scripts\") pod \"neutron-1efe-account-create-update-wh56p\" (UID: \"a1c517a3-e534-4a32-aaa1-23ad2d42fc10\") " pod="openstack/neutron-1efe-account-create-update-wh56p" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.364422 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c517a3-e534-4a32-aaa1-23ad2d42fc10-operator-scripts\") pod \"neutron-1efe-account-create-update-wh56p\" (UID: \"a1c517a3-e534-4a32-aaa1-23ad2d42fc10\") " pod="openstack/neutron-1efe-account-create-update-wh56p" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.394689 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbjz4\" (UniqueName: \"kubernetes.io/projected/a1c517a3-e534-4a32-aaa1-23ad2d42fc10-kube-api-access-mbjz4\") pod \"neutron-1efe-account-create-update-wh56p\" (UID: \"a1c517a3-e534-4a32-aaa1-23ad2d42fc10\") " pod="openstack/neutron-1efe-account-create-update-wh56p" Feb 25 16:11:24 crc kubenswrapper[4937]: I0225 16:11:24.690016 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1efe-account-create-update-wh56p" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.043793 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.136740 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vn2l\" (UniqueName: \"kubernetes.io/projected/0a0f0530-95e1-4231-9933-bedb49b72a88-kube-api-access-2vn2l\") pod \"0a0f0530-95e1-4231-9933-bedb49b72a88\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.136839 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-swiftconf\") pod \"0a0f0530-95e1-4231-9933-bedb49b72a88\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.136871 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a0f0530-95e1-4231-9933-bedb49b72a88-ring-data-devices\") pod \"0a0f0530-95e1-4231-9933-bedb49b72a88\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.136938 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-dispersionconf\") pod \"0a0f0530-95e1-4231-9933-bedb49b72a88\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.136997 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a0f0530-95e1-4231-9933-bedb49b72a88-etc-swift\") pod \"0a0f0530-95e1-4231-9933-bedb49b72a88\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.137036 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a0f0530-95e1-4231-9933-bedb49b72a88-scripts\") pod \"0a0f0530-95e1-4231-9933-bedb49b72a88\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.137152 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-combined-ca-bundle\") pod \"0a0f0530-95e1-4231-9933-bedb49b72a88\" (UID: \"0a0f0530-95e1-4231-9933-bedb49b72a88\") " Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.137885 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a0f0530-95e1-4231-9933-bedb49b72a88-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0a0f0530-95e1-4231-9933-bedb49b72a88" (UID: "0a0f0530-95e1-4231-9933-bedb49b72a88"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.138093 4937 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a0f0530-95e1-4231-9933-bedb49b72a88-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.138105 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a0f0530-95e1-4231-9933-bedb49b72a88-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0a0f0530-95e1-4231-9933-bedb49b72a88" (UID: "0a0f0530-95e1-4231-9933-bedb49b72a88"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.142510 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0f0530-95e1-4231-9933-bedb49b72a88-kube-api-access-2vn2l" (OuterVolumeSpecName: "kube-api-access-2vn2l") pod "0a0f0530-95e1-4231-9933-bedb49b72a88" (UID: "0a0f0530-95e1-4231-9933-bedb49b72a88"). InnerVolumeSpecName "kube-api-access-2vn2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.146013 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0a0f0530-95e1-4231-9933-bedb49b72a88" (UID: "0a0f0530-95e1-4231-9933-bedb49b72a88"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.160226 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a0f0530-95e1-4231-9933-bedb49b72a88-scripts" (OuterVolumeSpecName: "scripts") pod "0a0f0530-95e1-4231-9933-bedb49b72a88" (UID: "0a0f0530-95e1-4231-9933-bedb49b72a88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.165823 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a0f0530-95e1-4231-9933-bedb49b72a88" (UID: "0a0f0530-95e1-4231-9933-bedb49b72a88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.166101 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0a0f0530-95e1-4231-9933-bedb49b72a88" (UID: "0a0f0530-95e1-4231-9933-bedb49b72a88"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.210953 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6nkbm" event={"ID":"0a0f0530-95e1-4231-9933-bedb49b72a88","Type":"ContainerDied","Data":"ea22e5cce9aff8dddd101788c467357c52c58b4ffe43e782228d224a978ec10a"} Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.210995 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea22e5cce9aff8dddd101788c467357c52c58b4ffe43e782228d224a978ec10a" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.211102 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6nkbm" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.239925 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vn2l\" (UniqueName: \"kubernetes.io/projected/0a0f0530-95e1-4231-9933-bedb49b72a88-kube-api-access-2vn2l\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.239951 4937 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.239960 4937 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.239969 4937 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a0f0530-95e1-4231-9933-bedb49b72a88-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.239978 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a0f0530-95e1-4231-9933-bedb49b72a88-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:28 crc kubenswrapper[4937]: I0225 16:11:28.239985 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0f0530-95e1-4231-9933-bedb49b72a88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:30 crc kubenswrapper[4937]: E0225 16:11:30.247871 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741" Feb 25 16:11:30 crc kubenswrapper[4937]: E0225 16:11:30.248348 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus,Image:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741,Command:[],Args:[--config.file=/etc/prometheus/config_out/prometheus.env.yaml --web.enable-lifecycle --web.route-prefix=/ --storage.tsdb.retention.time=24h --storage.tsdb.path=/prometheus 
--web.config.file=/etc/prometheus/web_config/web-config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:web,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-out,ReadOnly:true,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-assets,ReadOnly:true,MountPath:/etc/prometheus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-db,ReadOnly:false,MountPath:/prometheus,SubPath:prometheus-db,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:true,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-1,ReadOnly:true,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-1,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-2,ReadOnly:true,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-2,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:web-config,ReadOnly:true,MountPath:/etc/prometheus/web_config/web-config.yaml,SubPath:web-config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zmxns,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/healthy,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:15,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(e19ac505-41d9-4d1d-b75a-0c88e26960c8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 
16:11:30.326144 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.483182 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-log-ovn\") pod \"79be1a43-3786-4cf6-8127-016cb865312c\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.483541 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79be1a43-3786-4cf6-8127-016cb865312c-scripts\") pod \"79be1a43-3786-4cf6-8127-016cb865312c\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.483602 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s24q2\" (UniqueName: \"kubernetes.io/projected/79be1a43-3786-4cf6-8127-016cb865312c-kube-api-access-s24q2\") pod \"79be1a43-3786-4cf6-8127-016cb865312c\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.483698 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79be1a43-3786-4cf6-8127-016cb865312c-additional-scripts\") pod \"79be1a43-3786-4cf6-8127-016cb865312c\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.483732 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-run\") pod \"79be1a43-3786-4cf6-8127-016cb865312c\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.483818 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-run-ovn\") pod \"79be1a43-3786-4cf6-8127-016cb865312c\" (UID: \"79be1a43-3786-4cf6-8127-016cb865312c\") " Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.483889 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "79be1a43-3786-4cf6-8127-016cb865312c" (UID: "79be1a43-3786-4cf6-8127-016cb865312c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.484399 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-run" (OuterVolumeSpecName: "var-run") pod "79be1a43-3786-4cf6-8127-016cb865312c" (UID: "79be1a43-3786-4cf6-8127-016cb865312c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.484891 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79be1a43-3786-4cf6-8127-016cb865312c-scripts" (OuterVolumeSpecName: "scripts") pod "79be1a43-3786-4cf6-8127-016cb865312c" (UID: "79be1a43-3786-4cf6-8127-016cb865312c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.485405 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79be1a43-3786-4cf6-8127-016cb865312c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "79be1a43-3786-4cf6-8127-016cb865312c" (UID: "79be1a43-3786-4cf6-8127-016cb865312c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.485688 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "79be1a43-3786-4cf6-8127-016cb865312c" (UID: "79be1a43-3786-4cf6-8127-016cb865312c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.486454 4937 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.486496 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79be1a43-3786-4cf6-8127-016cb865312c-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.486510 4937 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/79be1a43-3786-4cf6-8127-016cb865312c-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.486523 4937 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-run\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.486534 4937 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79be1a43-3786-4cf6-8127-016cb865312c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.496981 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79be1a43-3786-4cf6-8127-016cb865312c-kube-api-access-s24q2" (OuterVolumeSpecName: "kube-api-access-s24q2") pod "79be1a43-3786-4cf6-8127-016cb865312c" (UID: "79be1a43-3786-4cf6-8127-016cb865312c"). InnerVolumeSpecName "kube-api-access-s24q2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.588515 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s24q2\" (UniqueName: \"kubernetes.io/projected/79be1a43-3786-4cf6-8127-016cb865312c-kube-api-access-s24q2\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:30 crc kubenswrapper[4937]: W0225 16:11:30.735537 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2827afe8_442b_4aa9_95f6_48ef3c9a3995.slice/crio-e4f51597a992546a17a9210209356b388f3c14bc68e44d3d78e151ffafa6ce64 WatchSource:0}: Error finding container e4f51597a992546a17a9210209356b388f3c14bc68e44d3d78e151ffafa6ce64: Status 404 returned error can't find the container with id e4f51597a992546a17a9210209356b388f3c14bc68e44d3d78e151ffafa6ce64 Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.742724 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cj6fx"] Feb 25 16:11:30 crc kubenswrapper[4937]: I0225 16:11:30.967575 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gbxlp"] Feb 25 16:11:30 crc kubenswrapper[4937]: W0225 16:11:30.990056 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod231f1566_c91c_47f6_9ef5_5a9fbc5b0c57.slice/crio-2d8f4f53193a6203cc2a6a8081a1a70772a3af38be726b0f4e9f5e3692192723 WatchSource:0}: Error finding container 2d8f4f53193a6203cc2a6a8081a1a70772a3af38be726b0f4e9f5e3692192723: Status 404 returned error can't find the container with id 2d8f4f53193a6203cc2a6a8081a1a70772a3af38be726b0f4e9f5e3692192723 Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.007980 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1efe-account-create-update-wh56p"] Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.012041 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.022620 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-9cgvr"] Feb 25 16:11:31 crc kubenswrapper[4937]: W0225 16:11:31.026980 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1c517a3_e534_4a32_aaa1_23ad2d42fc10.slice/crio-7af075de6e4ed3e5bcceb80f994f38f001a99b5567dd73f4a35fd1b2b471ebf0 WatchSource:0}: Error finding container 7af075de6e4ed3e5bcceb80f994f38f001a99b5567dd73f4a35fd1b2b471ebf0: Status 404 returned error can't find the container with id 7af075de6e4ed3e5bcceb80f994f38f001a99b5567dd73f4a35fd1b2b471ebf0 Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.037716 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f8fc-account-create-update-drksr"] Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.219683 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zg54r"] Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.251107 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-10df-account-create-update-zbrbt"] Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.255936 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zg54r" 
event={"ID":"31d3da10-44f5-48ce-8279-6565217f5ab2","Type":"ContainerStarted","Data":"fc657a3ca945aae15d36600b0cf814e5aaba1e563eaaceea3d96a3b45c2a9298"} Feb 25 16:11:31 crc kubenswrapper[4937]: W0225 16:11:31.257380 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f262d7a_313f_4cdc_8a9a_5a5765fb3da0.slice/crio-f1c4a25ad0851b1f4249db2a7df01887e36aaae81646f83af459ba763624f695 WatchSource:0}: Error finding container f1c4a25ad0851b1f4249db2a7df01887e36aaae81646f83af459ba763624f695: Status 404 returned error can't find the container with id f1c4a25ad0851b1f4249db2a7df01887e36aaae81646f83af459ba763624f695 Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.263146 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gbxlp" event={"ID":"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57","Type":"ContainerStarted","Data":"2d8f4f53193a6203cc2a6a8081a1a70772a3af38be726b0f4e9f5e3692192723"} Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.273219 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sgdhb-config-5pqxn" event={"ID":"79be1a43-3786-4cf6-8127-016cb865312c","Type":"ContainerDied","Data":"7bab99daa9b795d3a3cf832863dace83e6cfe05a6e0bf280fd482cb8c8f6de72"} Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.273419 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bab99daa9b795d3a3cf832863dace83e6cfe05a6e0bf280fd482cb8c8f6de72" Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.273282 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sgdhb-config-5pqxn" Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.277156 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4nvj6"] Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.285756 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-a261-account-create-update-9j4jx"] Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.287066 4937 generic.go:334] "Generic (PLEG): container finished" podID="2827afe8-442b-4aa9-95f6-48ef3c9a3995" containerID="75a9b7ba6a480607e978fc6e8fb11bd9b61f37f954d1a73eb683ecd0a4a46fd4" exitCode=0 Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.287177 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cj6fx" event={"ID":"2827afe8-442b-4aa9-95f6-48ef3c9a3995","Type":"ContainerDied","Data":"75a9b7ba6a480607e978fc6e8fb11bd9b61f37f954d1a73eb683ecd0a4a46fd4"} Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.287231 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cj6fx" event={"ID":"2827afe8-442b-4aa9-95f6-48ef3c9a3995","Type":"ContainerStarted","Data":"e4f51597a992546a17a9210209356b388f3c14bc68e44d3d78e151ffafa6ce64"} Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.292370 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1efe-account-create-update-wh56p" event={"ID":"a1c517a3-e534-4a32-aaa1-23ad2d42fc10","Type":"ContainerStarted","Data":"7af075de6e4ed3e5bcceb80f994f38f001a99b5567dd73f4a35fd1b2b471ebf0"} Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.293918 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-9cgvr" 
event={"ID":"c1373796-a25a-406b-a417-a26ff42bbce4","Type":"ContainerStarted","Data":"bd90add7ac624dfea1d8ff67607e4244b0b24e77c591ce8a88bae579fe7be124"} Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.295158 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f8fc-account-create-update-drksr" event={"ID":"cca05068-9163-4e5f-abc8-c98462b3b6c8","Type":"ContainerStarted","Data":"f1955e8bbfa3ab2e3bba926c6dd8f0154dd526464f345a674ddadf43718b7134"} Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.299729 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8ce79682-d3ee-4afb-ba50-fdacc0fe6910","Type":"ContainerStarted","Data":"5f13f0147fc36b42229b77315eb4459b87bc8977f8c53d98d6e3540ff268c258"} Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.300664 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.324268 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.6500116460000003 podStartE2EDuration="50.324252474s" podCreationTimestamp="2026-02-25 16:10:41 +0000 UTC" firstStartedPulling="2026-02-25 16:10:42.785278311 +0000 UTC m=+1493.798670201" lastFinishedPulling="2026-02-25 16:11:30.459519139 +0000 UTC m=+1541.472911029" observedRunningTime="2026-02-25 16:11:31.323458885 +0000 UTC m=+1542.336850785" watchObservedRunningTime="2026-02-25 16:11:31.324252474 +0000 UTC m=+1542.337644364" Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.430190 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gfsht"] Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.437835 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-sgdhb-config-5pqxn"] Feb 25 16:11:31 crc kubenswrapper[4937]: I0225 16:11:31.452675 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-sgdhb-config-5pqxn"] Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.309898 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w8hgr" event={"ID":"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d","Type":"ContainerStarted","Data":"04d2954c76170b54b664e31e54d5a6f8eeaa54b13b5ee8c72c544d7f49d75bda"} Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.312981 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4nvj6" event={"ID":"88c34e87-5116-4387-9e2b-e5fbcedb6f55","Type":"ContainerStarted","Data":"9632a20f817f50214cc1a6bdf43c1931d70e235e65aaecba6b6230df70ab7ca2"} Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.314613 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gbxlp" event={"ID":"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57","Type":"ContainerStarted","Data":"5f67fb26717060327cc02c8f7a5125a3d6874d9200c9e2c5153551570e7d7ff0"} Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.317109 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-a261-account-create-update-9j4jx" event={"ID":"fceee1fa-35a1-4b5d-aca8-054ab1816927","Type":"ContainerStarted","Data":"274b9ce56fc7b77d6878d871391c4f2e4371c69ba071de597b85a964120f89ac"} Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.317176 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-a261-account-create-update-9j4jx" 
event={"ID":"fceee1fa-35a1-4b5d-aca8-054ab1816927","Type":"ContainerStarted","Data":"08087ab24dac7880092062dfa85c01b71f6d9528728be5889b6deadcee8604d6"} Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.318846 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-9cgvr" event={"ID":"c1373796-a25a-406b-a417-a26ff42bbce4","Type":"ContainerStarted","Data":"d28f74bba9cf2132a8c1737596d2cb91106e993882316795fd11a1578d272172"} Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.320564 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1efe-account-create-update-wh56p" event={"ID":"a1c517a3-e534-4a32-aaa1-23ad2d42fc10","Type":"ContainerStarted","Data":"bac0b4953d10367c1e1de1365b214139835f7a2eb291f9aad46dfc9a9da07968"} Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.322835 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gfsht" event={"ID":"296328ce-10bb-42d4-a3e6-3b4986e9b944","Type":"ContainerStarted","Data":"cb23e5ae864b685389756b3396da69b0ea65114200286df72ea4a90a4291c56c"} Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.322879 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gfsht" event={"ID":"296328ce-10bb-42d4-a3e6-3b4986e9b944","Type":"ContainerStarted","Data":"2fb2d378c2cd16b740e1aa53966ba68841cec50b33546c7fa6b9f76255c2f5a4"} Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.330744 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-10df-account-create-update-zbrbt" event={"ID":"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0","Type":"ContainerStarted","Data":"a2478a6237d1252b5ffeaeaefc5c4093115de93e2dfa46d9ab0e1cae15835b1a"} Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.330788 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-10df-account-create-update-zbrbt" event={"ID":"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0","Type":"ContainerStarted","Data":"f1c4a25ad0851b1f4249db2a7df01887e36aaae81646f83af459ba763624f695"} Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.336061 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f8fc-account-create-update-drksr" event={"ID":"cca05068-9163-4e5f-abc8-c98462b3b6c8","Type":"ContainerStarted","Data":"f20c055947b7a96675aa34ee94834d7d0eb316b3207ca30e8ed95725a2eac4c8"} Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.337620 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-w8hgr" podStartSLOduration=3.5020784000000003 podStartE2EDuration="21.337597465s" podCreationTimestamp="2026-02-25 16:11:11 +0000 UTC" firstStartedPulling="2026-02-25 16:11:12.556053202 +0000 UTC m=+1523.569445092" lastFinishedPulling="2026-02-25 16:11:30.391572267 +0000 UTC m=+1541.404964157" observedRunningTime="2026-02-25 16:11:32.326803094 +0000 UTC m=+1543.340194984" watchObservedRunningTime="2026-02-25 16:11:32.337597465 +0000 UTC m=+1543.350989365" Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.345216 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zg54r" event={"ID":"31d3da10-44f5-48ce-8279-6565217f5ab2","Type":"ContainerStarted","Data":"bb3ff9c2adb409ba539fa1cedc7bac2749456d98d0aea1cc56ea7135c55ab4b8"} Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.356448 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-create-9cgvr" podStartSLOduration=9.356432187 
podStartE2EDuration="9.356432187s" podCreationTimestamp="2026-02-25 16:11:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:32.350028817 +0000 UTC m=+1543.363420717" watchObservedRunningTime="2026-02-25 16:11:32.356432187 +0000 UTC m=+1543.369824077" Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.369033 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-1efe-account-create-update-wh56p" podStartSLOduration=8.369013563 podStartE2EDuration="8.369013563s" podCreationTimestamp="2026-02-25 16:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:32.36493294 +0000 UTC m=+1543.378324850" watchObservedRunningTime="2026-02-25 16:11:32.369013563 +0000 UTC m=+1543.382405453" Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.390914 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-gfsht" podStartSLOduration=9.390894931 podStartE2EDuration="9.390894931s" podCreationTimestamp="2026-02-25 16:11:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:32.380659304 +0000 UTC m=+1543.394051204" watchObservedRunningTime="2026-02-25 16:11:32.390894931 +0000 UTC m=+1543.404286821" Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.401828 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-a261-account-create-update-9j4jx" podStartSLOduration=9.401809575 podStartE2EDuration="9.401809575s" podCreationTimestamp="2026-02-25 16:11:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:32.395539097 +0000 UTC m=+1543.408930987" watchObservedRunningTime="2026-02-25 16:11:32.401809575 +0000 UTC m=+1543.415201465" Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.420985 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-gbxlp" podStartSLOduration=13.420963885 podStartE2EDuration="13.420963885s" podCreationTimestamp="2026-02-25 16:11:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:32.416666047 +0000 UTC m=+1543.430057937" watchObservedRunningTime="2026-02-25 16:11:32.420963885 +0000 UTC m=+1543.434355775" Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.436795 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-10df-account-create-update-zbrbt" podStartSLOduration=9.436778441 podStartE2EDuration="9.436778441s" podCreationTimestamp="2026-02-25 16:11:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:32.43115428 +0000 UTC m=+1543.444546170" watchObservedRunningTime="2026-02-25 16:11:32.436778441 +0000 UTC m=+1543.450170351" Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.458115 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-f8fc-account-create-update-drksr" podStartSLOduration=9.458091255 podStartE2EDuration="9.458091255s" podCreationTimestamp="2026-02-25 16:11:23 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:32.446676969 +0000 UTC m=+1543.460068859" watchObservedRunningTime="2026-02-25 16:11:32.458091255 +0000 UTC m=+1543.471483145" Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.488187 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-zg54r" podStartSLOduration=9.488165159 podStartE2EDuration="9.488165159s" podCreationTimestamp="2026-02-25 16:11:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:32.476670051 +0000 UTC m=+1543.490061941" watchObservedRunningTime="2026-02-25 16:11:32.488165159 +0000 UTC m=+1543.501557049" Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.788428 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cj6fx" Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.951430 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzxp5\" (UniqueName: \"kubernetes.io/projected/2827afe8-442b-4aa9-95f6-48ef3c9a3995-kube-api-access-jzxp5\") pod \"2827afe8-442b-4aa9-95f6-48ef3c9a3995\" (UID: \"2827afe8-442b-4aa9-95f6-48ef3c9a3995\") " Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.951566 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2827afe8-442b-4aa9-95f6-48ef3c9a3995-operator-scripts\") pod \"2827afe8-442b-4aa9-95f6-48ef3c9a3995\" (UID: \"2827afe8-442b-4aa9-95f6-48ef3c9a3995\") " Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.953386 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2827afe8-442b-4aa9-95f6-48ef3c9a3995-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2827afe8-442b-4aa9-95f6-48ef3c9a3995" (UID: "2827afe8-442b-4aa9-95f6-48ef3c9a3995"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:32 crc kubenswrapper[4937]: I0225 16:11:32.972468 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2827afe8-442b-4aa9-95f6-48ef3c9a3995-kube-api-access-jzxp5" (OuterVolumeSpecName: "kube-api-access-jzxp5") pod "2827afe8-442b-4aa9-95f6-48ef3c9a3995" (UID: "2827afe8-442b-4aa9-95f6-48ef3c9a3995"). InnerVolumeSpecName "kube-api-access-jzxp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:33 crc kubenswrapper[4937]: I0225 16:11:33.054556 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzxp5\" (UniqueName: \"kubernetes.io/projected/2827afe8-442b-4aa9-95f6-48ef3c9a3995-kube-api-access-jzxp5\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:33 crc kubenswrapper[4937]: I0225 16:11:33.054612 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2827afe8-442b-4aa9-95f6-48ef3c9a3995-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:33 crc kubenswrapper[4937]: I0225 16:11:33.357530 4937 generic.go:334] "Generic (PLEG): container finished" podID="231f1566-c91c-47f6-9ef5-5a9fbc5b0c57" containerID="5f67fb26717060327cc02c8f7a5125a3d6874d9200c9e2c5153551570e7d7ff0" exitCode=0 Feb 25 16:11:33 crc kubenswrapper[4937]: I0225 16:11:33.357640 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gbxlp" event={"ID":"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57","Type":"ContainerDied","Data":"5f67fb26717060327cc02c8f7a5125a3d6874d9200c9e2c5153551570e7d7ff0"} Feb 25 16:11:33 crc kubenswrapper[4937]: I0225 16:11:33.362891 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cj6fx" Feb 25 16:11:33 crc kubenswrapper[4937]: I0225 16:11:33.362896 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cj6fx" event={"ID":"2827afe8-442b-4aa9-95f6-48ef3c9a3995","Type":"ContainerDied","Data":"e4f51597a992546a17a9210209356b388f3c14bc68e44d3d78e151ffafa6ce64"} Feb 25 16:11:33 crc kubenswrapper[4937]: I0225 16:11:33.362963 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4f51597a992546a17a9210209356b388f3c14bc68e44d3d78e151ffafa6ce64" Feb 25 16:11:33 crc kubenswrapper[4937]: I0225 16:11:33.368147 4937 generic.go:334] "Generic (PLEG): container finished" podID="c1373796-a25a-406b-a417-a26ff42bbce4" containerID="d28f74bba9cf2132a8c1737596d2cb91106e993882316795fd11a1578d272172" exitCode=0 Feb 25 16:11:33 crc kubenswrapper[4937]: I0225 16:11:33.393716 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79be1a43-3786-4cf6-8127-016cb865312c" path="/var/lib/kubelet/pods/79be1a43-3786-4cf6-8127-016cb865312c/volumes" Feb 25 16:11:33 crc kubenswrapper[4937]: I0225 16:11:33.394677 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-9cgvr" event={"ID":"c1373796-a25a-406b-a417-a26ff42bbce4","Type":"ContainerDied","Data":"d28f74bba9cf2132a8c1737596d2cb91106e993882316795fd11a1578d272172"} Feb 25 16:11:34 crc kubenswrapper[4937]: I0225 16:11:34.228451 4937 scope.go:117] "RemoveContainer" containerID="8b1057707b0a9b295b1fd01c4263122dfc4d18d96bdda5e89ca220104284838f" Feb 25 16:11:34 crc kubenswrapper[4937]: I0225 16:11:34.377796 4937 generic.go:334] "Generic (PLEG): container finished" podID="31d3da10-44f5-48ce-8279-6565217f5ab2" containerID="bb3ff9c2adb409ba539fa1cedc7bac2749456d98d0aea1cc56ea7135c55ab4b8" exitCode=0 Feb 25 16:11:34 crc kubenswrapper[4937]: I0225 16:11:34.377856 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zg54r" event={"ID":"31d3da10-44f5-48ce-8279-6565217f5ab2","Type":"ContainerDied","Data":"bb3ff9c2adb409ba539fa1cedc7bac2749456d98d0aea1cc56ea7135c55ab4b8"} Feb 25 16:11:34 crc kubenswrapper[4937]: I0225 16:11:34.380521 4937 generic.go:334] "Generic (PLEG): 
container finished" podID="296328ce-10bb-42d4-a3e6-3b4986e9b944" containerID="cb23e5ae864b685389756b3396da69b0ea65114200286df72ea4a90a4291c56c" exitCode=0 Feb 25 16:11:34 crc kubenswrapper[4937]: I0225 16:11:34.380623 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gfsht" event={"ID":"296328ce-10bb-42d4-a3e6-3b4986e9b944","Type":"ContainerDied","Data":"cb23e5ae864b685389756b3396da69b0ea65114200286df72ea4a90a4291c56c"} Feb 25 16:11:34 crc kubenswrapper[4937]: I0225 16:11:34.381813 4937 generic.go:334] "Generic (PLEG): container finished" podID="a1c517a3-e534-4a32-aaa1-23ad2d42fc10" containerID="bac0b4953d10367c1e1de1365b214139835f7a2eb291f9aad46dfc9a9da07968" exitCode=0 Feb 25 16:11:34 crc kubenswrapper[4937]: I0225 16:11:34.381848 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1efe-account-create-update-wh56p" event={"ID":"a1c517a3-e534-4a32-aaa1-23ad2d42fc10","Type":"ContainerDied","Data":"bac0b4953d10367c1e1de1365b214139835f7a2eb291f9aad46dfc9a9da07968"} Feb 25 16:11:34 crc kubenswrapper[4937]: E0225 16:11:34.383667 4937 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1c517a3_e534_4a32_aaa1_23ad2d42fc10.slice/crio-conmon-bac0b4953d10367c1e1de1365b214139835f7a2eb291f9aad46dfc9a9da07968.scope\": RecentStats: unable to find data in memory cache]" Feb 25 16:11:35 crc kubenswrapper[4937]: I0225 16:11:35.406055 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e19ac505-41d9-4d1d-b75a-0c88e26960c8","Type":"ContainerStarted","Data":"fc97b8e06f2912e439a8df9a3ca91b06cb74e379b0a69a88fd74be43db38ea21"} Feb 25 16:11:35 crc kubenswrapper[4937]: I0225 16:11:35.407831 4937 generic.go:334] "Generic (PLEG): container finished" podID="fceee1fa-35a1-4b5d-aca8-054ab1816927" containerID="274b9ce56fc7b77d6878d871391c4f2e4371c69ba071de597b85a964120f89ac" exitCode=0 Feb 25 16:11:35 crc kubenswrapper[4937]: I0225 16:11:35.407876 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-a261-account-create-update-9j4jx" event={"ID":"fceee1fa-35a1-4b5d-aca8-054ab1816927","Type":"ContainerDied","Data":"274b9ce56fc7b77d6878d871391c4f2e4371c69ba071de597b85a964120f89ac"} Feb 25 16:11:35 crc kubenswrapper[4937]: I0225 16:11:35.409660 4937 generic.go:334] "Generic (PLEG): container finished" podID="2f262d7a-313f-4cdc-8a9a-5a5765fb3da0" containerID="a2478a6237d1252b5ffeaeaefc5c4093115de93e2dfa46d9ab0e1cae15835b1a" exitCode=0 Feb 25 16:11:35 crc kubenswrapper[4937]: I0225 16:11:35.409701 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-10df-account-create-update-zbrbt" event={"ID":"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0","Type":"ContainerDied","Data":"a2478a6237d1252b5ffeaeaefc5c4093115de93e2dfa46d9ab0e1cae15835b1a"} Feb 25 16:11:35 crc kubenswrapper[4937]: I0225 16:11:35.411365 4937 generic.go:334] "Generic (PLEG): container finished" podID="cca05068-9163-4e5f-abc8-c98462b3b6c8" containerID="f20c055947b7a96675aa34ee94834d7d0eb316b3207ca30e8ed95725a2eac4c8" exitCode=0 Feb 25 16:11:35 crc kubenswrapper[4937]: I0225 16:11:35.411452 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f8fc-account-create-update-drksr" event={"ID":"cca05068-9163-4e5f-abc8-c98462b3b6c8","Type":"ContainerDied","Data":"f20c055947b7a96675aa34ee94834d7d0eb316b3207ca30e8ed95725a2eac4c8"} Feb 25 16:11:38 crc 
kubenswrapper[4937]: I0225 16:11:38.151093 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gfsht" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.158005 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zg54r" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.188222 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gbxlp" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.203049 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-9cgvr" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.208113 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-a261-account-create-update-9j4jx" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.225263 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231f1566-c91c-47f6-9ef5-5a9fbc5b0c57-operator-scripts\") pod \"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57\" (UID: \"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.225346 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfvqx\" (UniqueName: \"kubernetes.io/projected/31d3da10-44f5-48ce-8279-6565217f5ab2-kube-api-access-vfvqx\") pod \"31d3da10-44f5-48ce-8279-6565217f5ab2\" (UID: \"31d3da10-44f5-48ce-8279-6565217f5ab2\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.225419 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/296328ce-10bb-42d4-a3e6-3b4986e9b944-operator-scripts\") pod \"296328ce-10bb-42d4-a3e6-3b4986e9b944\" (UID: \"296328ce-10bb-42d4-a3e6-3b4986e9b944\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.226393 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/231f1566-c91c-47f6-9ef5-5a9fbc5b0c57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "231f1566-c91c-47f6-9ef5-5a9fbc5b0c57" (UID: "231f1566-c91c-47f6-9ef5-5a9fbc5b0c57"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.227435 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/296328ce-10bb-42d4-a3e6-3b4986e9b944-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "296328ce-10bb-42d4-a3e6-3b4986e9b944" (UID: "296328ce-10bb-42d4-a3e6-3b4986e9b944"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.227535 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnjq6\" (UniqueName: \"kubernetes.io/projected/296328ce-10bb-42d4-a3e6-3b4986e9b944-kube-api-access-nnjq6\") pod \"296328ce-10bb-42d4-a3e6-3b4986e9b944\" (UID: \"296328ce-10bb-42d4-a3e6-3b4986e9b944\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.227644 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31d3da10-44f5-48ce-8279-6565217f5ab2-operator-scripts\") pod \"31d3da10-44f5-48ce-8279-6565217f5ab2\" (UID: \"31d3da10-44f5-48ce-8279-6565217f5ab2\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.227716 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-682k8\" (UniqueName: \"kubernetes.io/projected/231f1566-c91c-47f6-9ef5-5a9fbc5b0c57-kube-api-access-682k8\") pod \"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57\" (UID: \"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.227826 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f8fc-account-create-update-drksr" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.231249 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d3da10-44f5-48ce-8279-6565217f5ab2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31d3da10-44f5-48ce-8279-6565217f5ab2" (UID: "31d3da10-44f5-48ce-8279-6565217f5ab2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.232744 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231f1566-c91c-47f6-9ef5-5a9fbc5b0c57-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.232779 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/296328ce-10bb-42d4-a3e6-3b4986e9b944-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.232793 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31d3da10-44f5-48ce-8279-6565217f5ab2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.238176 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-10df-account-create-update-zbrbt" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.254611 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d3da10-44f5-48ce-8279-6565217f5ab2-kube-api-access-vfvqx" (OuterVolumeSpecName: "kube-api-access-vfvqx") pod "31d3da10-44f5-48ce-8279-6565217f5ab2" (UID: "31d3da10-44f5-48ce-8279-6565217f5ab2"). InnerVolumeSpecName "kube-api-access-vfvqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.258888 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1efe-account-create-update-wh56p" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.265998 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296328ce-10bb-42d4-a3e6-3b4986e9b944-kube-api-access-nnjq6" (OuterVolumeSpecName: "kube-api-access-nnjq6") pod "296328ce-10bb-42d4-a3e6-3b4986e9b944" (UID: "296328ce-10bb-42d4-a3e6-3b4986e9b944"). InnerVolumeSpecName "kube-api-access-nnjq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.280901 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231f1566-c91c-47f6-9ef5-5a9fbc5b0c57-kube-api-access-682k8" (OuterVolumeSpecName: "kube-api-access-682k8") pod "231f1566-c91c-47f6-9ef5-5a9fbc5b0c57" (UID: "231f1566-c91c-47f6-9ef5-5a9fbc5b0c57"). InnerVolumeSpecName "kube-api-access-682k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.341800 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psqgd\" (UniqueName: \"kubernetes.io/projected/c1373796-a25a-406b-a417-a26ff42bbce4-kube-api-access-psqgd\") pod \"c1373796-a25a-406b-a417-a26ff42bbce4\" (UID: \"c1373796-a25a-406b-a417-a26ff42bbce4\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.341864 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbjz4\" (UniqueName: \"kubernetes.io/projected/a1c517a3-e534-4a32-aaa1-23ad2d42fc10-kube-api-access-mbjz4\") pod \"a1c517a3-e534-4a32-aaa1-23ad2d42fc10\" (UID: \"a1c517a3-e534-4a32-aaa1-23ad2d42fc10\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.341917 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca05068-9163-4e5f-abc8-c98462b3b6c8-operator-scripts\") pod \"cca05068-9163-4e5f-abc8-c98462b3b6c8\" (UID: \"cca05068-9163-4e5f-abc8-c98462b3b6c8\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.341966 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fceee1fa-35a1-4b5d-aca8-054ab1816927-operator-scripts\") pod \"fceee1fa-35a1-4b5d-aca8-054ab1816927\" (UID: \"fceee1fa-35a1-4b5d-aca8-054ab1816927\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.342016 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c517a3-e534-4a32-aaa1-23ad2d42fc10-operator-scripts\") pod \"a1c517a3-e534-4a32-aaa1-23ad2d42fc10\" (UID: \"a1c517a3-e534-4a32-aaa1-23ad2d42fc10\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.342050 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f262d7a-313f-4cdc-8a9a-5a5765fb3da0-operator-scripts\") pod \"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0\" (UID: \"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.342072 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1373796-a25a-406b-a417-a26ff42bbce4-operator-scripts\") pod \"c1373796-a25a-406b-a417-a26ff42bbce4\" (UID: \"c1373796-a25a-406b-a417-a26ff42bbce4\") " Feb 25 16:11:38 crc 
kubenswrapper[4937]: I0225 16:11:38.342095 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j4r2\" (UniqueName: \"kubernetes.io/projected/2f262d7a-313f-4cdc-8a9a-5a5765fb3da0-kube-api-access-7j4r2\") pod \"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0\" (UID: \"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.342152 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frdmw\" (UniqueName: \"kubernetes.io/projected/cca05068-9163-4e5f-abc8-c98462b3b6c8-kube-api-access-frdmw\") pod \"cca05068-9163-4e5f-abc8-c98462b3b6c8\" (UID: \"cca05068-9163-4e5f-abc8-c98462b3b6c8\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.342177 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-746hx\" (UniqueName: \"kubernetes.io/projected/fceee1fa-35a1-4b5d-aca8-054ab1816927-kube-api-access-746hx\") pod \"fceee1fa-35a1-4b5d-aca8-054ab1816927\" (UID: \"fceee1fa-35a1-4b5d-aca8-054ab1816927\") " Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.342785 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfvqx\" (UniqueName: \"kubernetes.io/projected/31d3da10-44f5-48ce-8279-6565217f5ab2-kube-api-access-vfvqx\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.342802 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnjq6\" (UniqueName: \"kubernetes.io/projected/296328ce-10bb-42d4-a3e6-3b4986e9b944-kube-api-access-nnjq6\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.342814 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-682k8\" (UniqueName: \"kubernetes.io/projected/231f1566-c91c-47f6-9ef5-5a9fbc5b0c57-kube-api-access-682k8\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.343125 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1c517a3-e534-4a32-aaa1-23ad2d42fc10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1c517a3-e534-4a32-aaa1-23ad2d42fc10" (UID: "a1c517a3-e534-4a32-aaa1-23ad2d42fc10"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.343557 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1373796-a25a-406b-a417-a26ff42bbce4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1373796-a25a-406b-a417-a26ff42bbce4" (UID: "c1373796-a25a-406b-a417-a26ff42bbce4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.344602 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cca05068-9163-4e5f-abc8-c98462b3b6c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cca05068-9163-4e5f-abc8-c98462b3b6c8" (UID: "cca05068-9163-4e5f-abc8-c98462b3b6c8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.346098 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f262d7a-313f-4cdc-8a9a-5a5765fb3da0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f262d7a-313f-4cdc-8a9a-5a5765fb3da0" (UID: "2f262d7a-313f-4cdc-8a9a-5a5765fb3da0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.348063 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fceee1fa-35a1-4b5d-aca8-054ab1816927-kube-api-access-746hx" (OuterVolumeSpecName: "kube-api-access-746hx") pod "fceee1fa-35a1-4b5d-aca8-054ab1816927" (UID: "fceee1fa-35a1-4b5d-aca8-054ab1816927"). InnerVolumeSpecName "kube-api-access-746hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.348665 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca05068-9163-4e5f-abc8-c98462b3b6c8-kube-api-access-frdmw" (OuterVolumeSpecName: "kube-api-access-frdmw") pod "cca05068-9163-4e5f-abc8-c98462b3b6c8" (UID: "cca05068-9163-4e5f-abc8-c98462b3b6c8"). InnerVolumeSpecName "kube-api-access-frdmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.348800 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c517a3-e534-4a32-aaa1-23ad2d42fc10-kube-api-access-mbjz4" (OuterVolumeSpecName: "kube-api-access-mbjz4") pod "a1c517a3-e534-4a32-aaa1-23ad2d42fc10" (UID: "a1c517a3-e534-4a32-aaa1-23ad2d42fc10"). InnerVolumeSpecName "kube-api-access-mbjz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.349744 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fceee1fa-35a1-4b5d-aca8-054ab1816927-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fceee1fa-35a1-4b5d-aca8-054ab1816927" (UID: "fceee1fa-35a1-4b5d-aca8-054ab1816927"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.349977 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1373796-a25a-406b-a417-a26ff42bbce4-kube-api-access-psqgd" (OuterVolumeSpecName: "kube-api-access-psqgd") pod "c1373796-a25a-406b-a417-a26ff42bbce4" (UID: "c1373796-a25a-406b-a417-a26ff42bbce4"). InnerVolumeSpecName "kube-api-access-psqgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.355899 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f262d7a-313f-4cdc-8a9a-5a5765fb3da0-kube-api-access-7j4r2" (OuterVolumeSpecName: "kube-api-access-7j4r2") pod "2f262d7a-313f-4cdc-8a9a-5a5765fb3da0" (UID: "2f262d7a-313f-4cdc-8a9a-5a5765fb3da0"). InnerVolumeSpecName "kube-api-access-7j4r2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.443553 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4nvj6" event={"ID":"88c34e87-5116-4387-9e2b-e5fbcedb6f55","Type":"ContainerStarted","Data":"c3cf326623410f2d9aca12fc12f29fcb1221a82e4948c138eb02590abd9c8891"} Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.444545 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psqgd\" (UniqueName: \"kubernetes.io/projected/c1373796-a25a-406b-a417-a26ff42bbce4-kube-api-access-psqgd\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.444578 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbjz4\" (UniqueName: \"kubernetes.io/projected/a1c517a3-e534-4a32-aaa1-23ad2d42fc10-kube-api-access-mbjz4\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.444597 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca05068-9163-4e5f-abc8-c98462b3b6c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.444615 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fceee1fa-35a1-4b5d-aca8-054ab1816927-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.444632 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1c517a3-e534-4a32-aaa1-23ad2d42fc10-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.444648 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f262d7a-313f-4cdc-8a9a-5a5765fb3da0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.444663 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1373796-a25a-406b-a417-a26ff42bbce4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.444680 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j4r2\" (UniqueName: \"kubernetes.io/projected/2f262d7a-313f-4cdc-8a9a-5a5765fb3da0-kube-api-access-7j4r2\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.444697 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frdmw\" (UniqueName: \"kubernetes.io/projected/cca05068-9163-4e5f-abc8-c98462b3b6c8-kube-api-access-frdmw\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.444710 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-746hx\" (UniqueName: \"kubernetes.io/projected/fceee1fa-35a1-4b5d-aca8-054ab1816927-kube-api-access-746hx\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.448787 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-gfsht" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.449330 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gfsht" event={"ID":"296328ce-10bb-42d4-a3e6-3b4986e9b944","Type":"ContainerDied","Data":"2fb2d378c2cd16b740e1aa53966ba68841cec50b33546c7fa6b9f76255c2f5a4"} Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.449371 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fb2d378c2cd16b740e1aa53966ba68841cec50b33546c7fa6b9f76255c2f5a4" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.459892 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f8fc-account-create-update-drksr" event={"ID":"cca05068-9163-4e5f-abc8-c98462b3b6c8","Type":"ContainerDied","Data":"f1955e8bbfa3ab2e3bba926c6dd8f0154dd526464f345a674ddadf43718b7134"} Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.459930 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1955e8bbfa3ab2e3bba926c6dd8f0154dd526464f345a674ddadf43718b7134" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.460047 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f8fc-account-create-update-drksr" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.465366 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gbxlp" event={"ID":"231f1566-c91c-47f6-9ef5-5a9fbc5b0c57","Type":"ContainerDied","Data":"2d8f4f53193a6203cc2a6a8081a1a70772a3af38be726b0f4e9f5e3692192723"} Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.465409 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d8f4f53193a6203cc2a6a8081a1a70772a3af38be726b0f4e9f5e3692192723" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.465643 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gbxlp" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.469015 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-4nvj6" podStartSLOduration=8.807470606999999 podStartE2EDuration="15.468990814s" podCreationTimestamp="2026-02-25 16:11:23 +0000 UTC" firstStartedPulling="2026-02-25 16:11:31.307385422 +0000 UTC m=+1542.320777312" lastFinishedPulling="2026-02-25 16:11:37.968905629 +0000 UTC m=+1548.982297519" observedRunningTime="2026-02-25 16:11:38.458294446 +0000 UTC m=+1549.471686326" watchObservedRunningTime="2026-02-25 16:11:38.468990814 +0000 UTC m=+1549.482382724" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.471670 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-a261-account-create-update-9j4jx" event={"ID":"fceee1fa-35a1-4b5d-aca8-054ab1816927","Type":"ContainerDied","Data":"08087ab24dac7880092062dfa85c01b71f6d9528728be5889b6deadcee8604d6"} Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.471710 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08087ab24dac7880092062dfa85c01b71f6d9528728be5889b6deadcee8604d6" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.471791 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-a261-account-create-update-9j4jx" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.478756 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-10df-account-create-update-zbrbt" event={"ID":"2f262d7a-313f-4cdc-8a9a-5a5765fb3da0","Type":"ContainerDied","Data":"f1c4a25ad0851b1f4249db2a7df01887e36aaae81646f83af459ba763624f695"} Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.478793 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1c4a25ad0851b1f4249db2a7df01887e36aaae81646f83af459ba763624f695" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.478792 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-10df-account-create-update-zbrbt" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.480217 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-9cgvr" event={"ID":"c1373796-a25a-406b-a417-a26ff42bbce4","Type":"ContainerDied","Data":"bd90add7ac624dfea1d8ff67607e4244b0b24e77c591ce8a88bae579fe7be124"} Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.480258 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd90add7ac624dfea1d8ff67607e4244b0b24e77c591ce8a88bae579fe7be124" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.480307 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-9cgvr" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.483916 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1efe-account-create-update-wh56p" event={"ID":"a1c517a3-e534-4a32-aaa1-23ad2d42fc10","Type":"ContainerDied","Data":"7af075de6e4ed3e5bcceb80f994f38f001a99b5567dd73f4a35fd1b2b471ebf0"} Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.483990 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7af075de6e4ed3e5bcceb80f994f38f001a99b5567dd73f4a35fd1b2b471ebf0" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.484066 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1efe-account-create-update-wh56p" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.486251 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zg54r" event={"ID":"31d3da10-44f5-48ce-8279-6565217f5ab2","Type":"ContainerDied","Data":"fc657a3ca945aae15d36600b0cf814e5aaba1e563eaaceea3d96a3b45c2a9298"} Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.486385 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc657a3ca945aae15d36600b0cf814e5aaba1e563eaaceea3d96a3b45c2a9298" Feb 25 16:11:38 crc kubenswrapper[4937]: I0225 16:11:38.486600 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-zg54r" Feb 25 16:11:41 crc kubenswrapper[4937]: E0225 16:11:41.423342 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" Feb 25 16:11:41 crc kubenswrapper[4937]: I0225 16:11:41.515747 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e19ac505-41d9-4d1d-b75a-0c88e26960c8","Type":"ContainerStarted","Data":"ce7810c7559494ae2ce2d68e150d814e5181253265d082b3bf485bde3ffceb08"} Feb 25 16:11:42 crc kubenswrapper[4937]: I0225 16:11:42.397700 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 25 16:11:43 crc kubenswrapper[4937]: I0225 16:11:43.266388 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:11:43 crc kubenswrapper[4937]: I0225 16:11:43.276804 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/48d22af0-5579-46fb-889d-fd34e46d26e9-etc-swift\") pod \"swift-storage-0\" (UID: \"48d22af0-5579-46fb-889d-fd34e46d26e9\") " pod="openstack/swift-storage-0" Feb 25 16:11:43 crc kubenswrapper[4937]: I0225 16:11:43.495987 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 25 16:11:43 crc kubenswrapper[4937]: I0225 16:11:43.530590 4937 generic.go:334] "Generic (PLEG): container finished" podID="8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d" containerID="04d2954c76170b54b664e31e54d5a6f8eeaa54b13b5ee8c72c544d7f49d75bda" exitCode=0 Feb 25 16:11:43 crc kubenswrapper[4937]: I0225 16:11:43.530647 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w8hgr" event={"ID":"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d","Type":"ContainerDied","Data":"04d2954c76170b54b664e31e54d5a6f8eeaa54b13b5ee8c72c544d7f49d75bda"} Feb 25 16:11:43 crc kubenswrapper[4937]: I0225 16:11:43.532695 4937 generic.go:334] "Generic (PLEG): container finished" podID="88c34e87-5116-4387-9e2b-e5fbcedb6f55" containerID="c3cf326623410f2d9aca12fc12f29fcb1221a82e4948c138eb02590abd9c8891" exitCode=0 Feb 25 16:11:43 crc kubenswrapper[4937]: I0225 16:11:43.532746 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4nvj6" event={"ID":"88c34e87-5116-4387-9e2b-e5fbcedb6f55","Type":"ContainerDied","Data":"c3cf326623410f2d9aca12fc12f29fcb1221a82e4948c138eb02590abd9c8891"} Feb 25 16:11:43 crc kubenswrapper[4937]: I0225 16:11:43.543586 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e19ac505-41d9-4d1d-b75a-0c88e26960c8","Type":"ContainerStarted","Data":"a6f706d7310d97b2900b971c494236f99adf6904c65548a724f429abe7365374"} Feb 25 16:11:43 crc kubenswrapper[4937]: I0225 16:11:43.588583 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.431714016 podStartE2EDuration="48.588565391s" podCreationTimestamp="2026-02-25 16:10:55 +0000 UTC" 
firstStartedPulling="2026-02-25 16:11:16.078706761 +0000 UTC m=+1527.092098651" lastFinishedPulling="2026-02-25 16:11:42.235558126 +0000 UTC m=+1553.248950026" observedRunningTime="2026-02-25 16:11:43.580325824 +0000 UTC m=+1554.593717714" watchObservedRunningTime="2026-02-25 16:11:43.588565391 +0000 UTC m=+1554.601957281" Feb 25 16:11:44 crc kubenswrapper[4937]: I0225 16:11:44.074416 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 25 16:11:44 crc kubenswrapper[4937]: I0225 16:11:44.555575 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"5d32b035f2889a508e2e200fe4c2fd2b99143f587d59e442af4ca1074872e960"} Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.008183 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4nvj6" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.132955 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bv42\" (UniqueName: \"kubernetes.io/projected/88c34e87-5116-4387-9e2b-e5fbcedb6f55-kube-api-access-2bv42\") pod \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\" (UID: \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\") " Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.132999 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88c34e87-5116-4387-9e2b-e5fbcedb6f55-config-data\") pod \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\" (UID: \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\") " Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.133161 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c34e87-5116-4387-9e2b-e5fbcedb6f55-combined-ca-bundle\") pod \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\" (UID: \"88c34e87-5116-4387-9e2b-e5fbcedb6f55\") " Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.139767 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88c34e87-5116-4387-9e2b-e5fbcedb6f55-kube-api-access-2bv42" (OuterVolumeSpecName: "kube-api-access-2bv42") pod "88c34e87-5116-4387-9e2b-e5fbcedb6f55" (UID: "88c34e87-5116-4387-9e2b-e5fbcedb6f55"). InnerVolumeSpecName "kube-api-access-2bv42". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.142068 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.181867 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88c34e87-5116-4387-9e2b-e5fbcedb6f55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88c34e87-5116-4387-9e2b-e5fbcedb6f55" (UID: "88c34e87-5116-4387-9e2b-e5fbcedb6f55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.195508 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88c34e87-5116-4387-9e2b-e5fbcedb6f55-config-data" (OuterVolumeSpecName: "config-data") pod "88c34e87-5116-4387-9e2b-e5fbcedb6f55" (UID: "88c34e87-5116-4387-9e2b-e5fbcedb6f55"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.235293 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-db-sync-config-data\") pod \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.235438 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll2cv\" (UniqueName: \"kubernetes.io/projected/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-kube-api-access-ll2cv\") pod \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.235542 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-combined-ca-bundle\") pod \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.235571 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-config-data\") pod \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\" (UID: \"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d\") " Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.237050 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c34e87-5116-4387-9e2b-e5fbcedb6f55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.237077 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bv42\" (UniqueName: \"kubernetes.io/projected/88c34e87-5116-4387-9e2b-e5fbcedb6f55-kube-api-access-2bv42\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.237091 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88c34e87-5116-4387-9e2b-e5fbcedb6f55-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.239061 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d" (UID: "8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.240838 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-kube-api-access-ll2cv" (OuterVolumeSpecName: "kube-api-access-ll2cv") pod "8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d" (UID: "8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d"). InnerVolumeSpecName "kube-api-access-ll2cv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.264883 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d" (UID: "8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.286300 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-config-data" (OuterVolumeSpecName: "config-data") pod "8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d" (UID: "8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.338446 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll2cv\" (UniqueName: \"kubernetes.io/projected/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-kube-api-access-ll2cv\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.338501 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.338515 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.338526 4937 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.568792 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w8hgr" event={"ID":"8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d","Type":"ContainerDied","Data":"50315280550d3775f6167e28a01ad520521a45388978c5c68a446abbe8b4a515"} Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.568855 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50315280550d3775f6167e28a01ad520521a45388978c5c68a446abbe8b4a515" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.568951 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w8hgr" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.572991 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4nvj6" event={"ID":"88c34e87-5116-4387-9e2b-e5fbcedb6f55","Type":"ContainerDied","Data":"9632a20f817f50214cc1a6bdf43c1931d70e235e65aaecba6b6230df70ab7ca2"} Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.573021 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9632a20f817f50214cc1a6bdf43c1931d70e235e65aaecba6b6230df70ab7ca2" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.573069 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4nvj6" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.926831 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-bslrt"] Feb 25 16:11:45 crc kubenswrapper[4937]: E0225 16:11:45.927613 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fceee1fa-35a1-4b5d-aca8-054ab1816927" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.927640 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="fceee1fa-35a1-4b5d-aca8-054ab1816927" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: E0225 16:11:45.927660 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0f0530-95e1-4231-9933-bedb49b72a88" containerName="swift-ring-rebalance" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.927668 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0f0530-95e1-4231-9933-bedb49b72a88" containerName="swift-ring-rebalance" Feb 25 16:11:45 crc kubenswrapper[4937]: E0225 16:11:45.927677 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca05068-9163-4e5f-abc8-c98462b3b6c8" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.927685 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca05068-9163-4e5f-abc8-c98462b3b6c8" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: E0225 16:11:45.927701 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231f1566-c91c-47f6-9ef5-5a9fbc5b0c57" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.927710 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="231f1566-c91c-47f6-9ef5-5a9fbc5b0c57" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: E0225 16:11:45.927723 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1373796-a25a-406b-a417-a26ff42bbce4" containerName="mariadb-database-create" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.927731 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1373796-a25a-406b-a417-a26ff42bbce4" containerName="mariadb-database-create" Feb 25 16:11:45 crc kubenswrapper[4937]: E0225 16:11:45.927742 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2827afe8-442b-4aa9-95f6-48ef3c9a3995" containerName="mariadb-database-create" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.927750 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2827afe8-442b-4aa9-95f6-48ef3c9a3995" containerName="mariadb-database-create" Feb 25 16:11:45 crc kubenswrapper[4937]: E0225 16:11:45.927761 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c517a3-e534-4a32-aaa1-23ad2d42fc10" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.927768 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c517a3-e534-4a32-aaa1-23ad2d42fc10" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: E0225 16:11:45.927784 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c34e87-5116-4387-9e2b-e5fbcedb6f55" containerName="keystone-db-sync" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.927792 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c34e87-5116-4387-9e2b-e5fbcedb6f55" 
containerName="keystone-db-sync" Feb 25 16:11:45 crc kubenswrapper[4937]: E0225 16:11:45.927808 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d" containerName="glance-db-sync" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.927816 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d" containerName="glance-db-sync" Feb 25 16:11:45 crc kubenswrapper[4937]: E0225 16:11:45.927825 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="296328ce-10bb-42d4-a3e6-3b4986e9b944" containerName="mariadb-database-create" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.927834 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="296328ce-10bb-42d4-a3e6-3b4986e9b944" containerName="mariadb-database-create" Feb 25 16:11:45 crc kubenswrapper[4937]: E0225 16:11:45.927850 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79be1a43-3786-4cf6-8127-016cb865312c" containerName="ovn-config" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.927859 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="79be1a43-3786-4cf6-8127-016cb865312c" containerName="ovn-config" Feb 25 16:11:45 crc kubenswrapper[4937]: E0225 16:11:45.927873 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d3da10-44f5-48ce-8279-6565217f5ab2" containerName="mariadb-database-create" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.927882 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d3da10-44f5-48ce-8279-6565217f5ab2" containerName="mariadb-database-create" Feb 25 16:11:45 crc kubenswrapper[4937]: E0225 16:11:45.927899 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f262d7a-313f-4cdc-8a9a-5a5765fb3da0" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.927907 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f262d7a-313f-4cdc-8a9a-5a5765fb3da0" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.928115 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="fceee1fa-35a1-4b5d-aca8-054ab1816927" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.928131 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1373796-a25a-406b-a417-a26ff42bbce4" containerName="mariadb-database-create" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.928143 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1c517a3-e534-4a32-aaa1-23ad2d42fc10" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.928155 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="296328ce-10bb-42d4-a3e6-3b4986e9b944" containerName="mariadb-database-create" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.928167 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="231f1566-c91c-47f6-9ef5-5a9fbc5b0c57" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.928178 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="79be1a43-3786-4cf6-8127-016cb865312c" containerName="ovn-config" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.928192 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca05068-9163-4e5f-abc8-c98462b3b6c8" containerName="mariadb-account-create-update" 
Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.928205 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d" containerName="glance-db-sync" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.928216 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="2827afe8-442b-4aa9-95f6-48ef3c9a3995" containerName="mariadb-database-create" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.928229 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a0f0530-95e1-4231-9933-bedb49b72a88" containerName="swift-ring-rebalance" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.928243 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f262d7a-313f-4cdc-8a9a-5a5765fb3da0" containerName="mariadb-account-create-update" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.928252 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d3da10-44f5-48ce-8279-6565217f5ab2" containerName="mariadb-database-create" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.928260 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c34e87-5116-4387-9e2b-e5fbcedb6f55" containerName="keystone-db-sync" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.930715 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.947822 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-bslrt"] Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.974884 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-config\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.974959 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.975000 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.975017 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhkh7\" (UniqueName: \"kubernetes.io/projected/cc191ad9-1ac9-4372-a31e-dabd1b174031-kube-api-access-nhkh7\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:45 crc kubenswrapper[4937]: I0225 16:11:45.975150 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: 
\"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.081278 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.081370 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-config\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.081429 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.081463 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.081491 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhkh7\" (UniqueName: \"kubernetes.io/projected/cc191ad9-1ac9-4372-a31e-dabd1b174031-kube-api-access-nhkh7\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.082594 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.083086 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-config\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.084671 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.084910 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:46 crc kubenswrapper[4937]: 
I0225 16:11:46.116524 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhkh7\" (UniqueName: \"kubernetes.io/projected/cc191ad9-1ac9-4372-a31e-dabd1b174031-kube-api-access-nhkh7\") pod \"dnsmasq-dns-5b946c75cc-bslrt\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.172775 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-x95b4"] Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.177555 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.186335 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.186663 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d9p7q" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.186990 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.187100 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.203478 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.217051 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x95b4"] Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.250929 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.257556 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-bslrt"] Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.280119 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.285026 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-fernet-keys\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.285069 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-scripts\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.285158 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-combined-ca-bundle\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.285206 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-credential-keys\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.285235 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-config-data\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.285259 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mnlg\" (UniqueName: \"kubernetes.io/projected/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-kube-api-access-2mnlg\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.352946 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-784f69c749-z6c5g"] Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.367976 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.396564 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-credential-keys\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.396800 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-config-data\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.396858 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mnlg\" (UniqueName: \"kubernetes.io/projected/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-kube-api-access-2mnlg\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.397159 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-fernet-keys\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.397201 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-scripts\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.398130 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-combined-ca-bundle\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.437141 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-config-data\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.464621 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-fernet-keys\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.465478 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-scripts\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.494635 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-784f69c749-z6c5g"] Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.496233 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-combined-ca-bundle\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.496572 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-credential-keys\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.504547 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mnlg\" (UniqueName: \"kubernetes.io/projected/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-kube-api-access-2mnlg\") pod \"keystone-bootstrap-x95b4\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.526498 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-config\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.526582 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-dns-svc\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.526731 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.526815 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqf2k\" (UniqueName: \"kubernetes.io/projected/6ac4754b-f29c-43cf-9bae-9902a5670866-kube-api-access-fqf2k\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.526918 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.540520 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-4vmm4"] Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.543160 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4vmm4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.565274 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.565730 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hb9w4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.574725 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.601982 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.624556 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4vmm4"] Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.631459 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.631533 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqf2k\" (UniqueName: \"kubernetes.io/projected/6ac4754b-f29c-43cf-9bae-9902a5670866-kube-api-access-fqf2k\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.631570 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.631638 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90764917-3dc9-4778-b224-67cb4ae1e49d-config\") pod \"neutron-db-sync-4vmm4\" (UID: \"90764917-3dc9-4778-b224-67cb4ae1e49d\") " pod="openstack/neutron-db-sync-4vmm4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.631673 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-config\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.631692 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t87p\" (UniqueName: \"kubernetes.io/projected/90764917-3dc9-4778-b224-67cb4ae1e49d-kube-api-access-4t87p\") pod \"neutron-db-sync-4vmm4\" (UID: \"90764917-3dc9-4778-b224-67cb4ae1e49d\") " pod="openstack/neutron-db-sync-4vmm4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.631717 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-dns-svc\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: 
\"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.631743 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90764917-3dc9-4778-b224-67cb4ae1e49d-combined-ca-bundle\") pod \"neutron-db-sync-4vmm4\" (UID: \"90764917-3dc9-4778-b224-67cb4ae1e49d\") " pod="openstack/neutron-db-sync-4vmm4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.632610 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.633333 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.633929 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-config\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.634397 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-dns-svc\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.683312 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6rpk2"] Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.686008 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqf2k\" (UniqueName: \"kubernetes.io/projected/6ac4754b-f29c-43cf-9bae-9902a5670866-kube-api-access-fqf2k\") pod \"dnsmasq-dns-784f69c749-z6c5g\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.689764 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6rpk2"] Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.689875 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.710548 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.710723 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kvzm6" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.710804 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.733416 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t87p\" (UniqueName: \"kubernetes.io/projected/90764917-3dc9-4778-b224-67cb4ae1e49d-kube-api-access-4t87p\") pod \"neutron-db-sync-4vmm4\" (UID: \"90764917-3dc9-4778-b224-67cb4ae1e49d\") " pod="openstack/neutron-db-sync-4vmm4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.733926 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/006fb5e7-a244-4758-8065-3615f5a2b9b7-etc-machine-id\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.734003 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90764917-3dc9-4778-b224-67cb4ae1e49d-combined-ca-bundle\") pod \"neutron-db-sync-4vmm4\" (UID: \"90764917-3dc9-4778-b224-67cb4ae1e49d\") " pod="openstack/neutron-db-sync-4vmm4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.734052 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-scripts\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.734175 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-db-sync-config-data\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.734223 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-config-data\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.734373 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90764917-3dc9-4778-b224-67cb4ae1e49d-config\") pod \"neutron-db-sync-4vmm4\" (UID: \"90764917-3dc9-4778-b224-67cb4ae1e49d\") " pod="openstack/neutron-db-sync-4vmm4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.734397 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp2c7\" (UniqueName: \"kubernetes.io/projected/006fb5e7-a244-4758-8065-3615f5a2b9b7-kube-api-access-zp2c7\") pod \"cinder-db-sync-6rpk2\" 
(UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.734427 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-combined-ca-bundle\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.741989 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/90764917-3dc9-4778-b224-67cb4ae1e49d-config\") pod \"neutron-db-sync-4vmm4\" (UID: \"90764917-3dc9-4778-b224-67cb4ae1e49d\") " pod="openstack/neutron-db-sync-4vmm4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.743928 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90764917-3dc9-4778-b224-67cb4ae1e49d-combined-ca-bundle\") pod \"neutron-db-sync-4vmm4\" (UID: \"90764917-3dc9-4778-b224-67cb4ae1e49d\") " pod="openstack/neutron-db-sync-4vmm4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.744316 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.815416 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-b4zhr"] Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.816751 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.823779 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t87p\" (UniqueName: \"kubernetes.io/projected/90764917-3dc9-4778-b224-67cb4ae1e49d-kube-api-access-4t87p\") pod \"neutron-db-sync-4vmm4\" (UID: \"90764917-3dc9-4778-b224-67cb4ae1e49d\") " pod="openstack/neutron-db-sync-4vmm4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.845614 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.845785 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-5lx8b" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.845910 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.846008 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.847594 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp2c7\" (UniqueName: \"kubernetes.io/projected/006fb5e7-a244-4758-8065-3615f5a2b9b7-kube-api-access-zp2c7\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.847625 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-combined-ca-bundle\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " 
pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.847658 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/006fb5e7-a244-4758-8065-3615f5a2b9b7-etc-machine-id\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.847691 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-scripts\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.847751 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-db-sync-config-data\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.847771 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-config-data\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.848161 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/006fb5e7-a244-4758-8065-3615f5a2b9b7-etc-machine-id\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.854965 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-combined-ca-bundle\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.866057 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4vmm4" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.868357 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-scripts\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.869040 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8tmw5"] Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.870416 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-db-sync-config-data\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.870712 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.887283 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-config-data\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.895377 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.895698 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2lh9m" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.895820 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.953466 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-config-data\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.976081 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r6xc\" (UniqueName: \"kubernetes.io/projected/8f988c32-d57e-4e63-add5-1e86a8818641-kube-api-access-6r6xc\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.976252 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/44849697-9b41-4439-b8c7-f497036543aa-certs\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.976332 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn6jr\" (UniqueName: \"kubernetes.io/projected/44849697-9b41-4439-b8c7-f497036543aa-kube-api-access-tn6jr\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.976442 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-scripts\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.976633 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-combined-ca-bundle\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.976720 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-config-data\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.976795 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-combined-ca-bundle\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.976912 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f988c32-d57e-4e63-add5-1e86a8818641-logs\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.977031 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-scripts\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:46 crc kubenswrapper[4937]: I0225 16:11:46.976125 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp2c7\" (UniqueName: \"kubernetes.io/projected/006fb5e7-a244-4758-8065-3615f5a2b9b7-kube-api-access-zp2c7\") pod \"cinder-db-sync-6rpk2\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.033052 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-b4zhr"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.071131 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.089620 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-config-data\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.089666 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-combined-ca-bundle\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.089709 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f988c32-d57e-4e63-add5-1e86a8818641-logs\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.089745 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-scripts\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.089799 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-config-data\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.089814 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r6xc\" (UniqueName: \"kubernetes.io/projected/8f988c32-d57e-4e63-add5-1e86a8818641-kube-api-access-6r6xc\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.089837 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn6jr\" (UniqueName: \"kubernetes.io/projected/44849697-9b41-4439-b8c7-f497036543aa-kube-api-access-tn6jr\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.089854 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/44849697-9b41-4439-b8c7-f497036543aa-certs\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.089875 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-scripts\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.089922 4937 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-combined-ca-bundle\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.098438 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/44849697-9b41-4439-b8c7-f497036543aa-certs\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.099105 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-combined-ca-bundle\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.099342 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f988c32-d57e-4e63-add5-1e86a8818641-logs\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.115472 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-z6c5g"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.118696 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-scripts\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.131931 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-config-data\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.133214 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-scripts\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.138672 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-config-data\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.154762 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-combined-ca-bundle\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.167925 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r6xc\" (UniqueName: 
\"kubernetes.io/projected/8f988c32-d57e-4e63-add5-1e86a8818641-kube-api-access-6r6xc\") pod \"placement-db-sync-8tmw5\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.169211 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn6jr\" (UniqueName: \"kubernetes.io/projected/44849697-9b41-4439-b8c7-f497036543aa-kube-api-access-tn6jr\") pod \"cloudkitty-db-sync-b4zhr\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.213675 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.216292 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.219460 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.222139 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.239461 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.265591 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8tmw5"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.292397 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.298444 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-scripts\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.298517 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ef5e4a5-46f1-4f72-ab91-699865d33243-log-httpd\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.298554 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-config-data\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.298585 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.298602 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgg8j\" (UniqueName: \"kubernetes.io/projected/8ef5e4a5-46f1-4f72-ab91-699865d33243-kube-api-access-hgg8j\") pod \"ceilometer-0\" (UID: 
\"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.298616 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.298691 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ef5e4a5-46f1-4f72-ab91-699865d33243-run-httpd\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.322203 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-7w4sz"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.323697 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7w4sz" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.343016 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.343314 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5w7rh" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.394432 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-9tdjb"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.394590 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8tmw5" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.396512 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.400157 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5cb6\" (UniqueName: \"kubernetes.io/projected/38a537ec-7743-44bd-b428-fa52adf39305-kube-api-access-z5cb6\") pod \"barbican-db-sync-7w4sz\" (UID: \"38a537ec-7743-44bd-b428-fa52adf39305\") " pod="openstack/barbican-db-sync-7w4sz" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.400204 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-scripts\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.400250 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ef5e4a5-46f1-4f72-ab91-699865d33243-log-httpd\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.400281 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-config-data\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.400306 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.400325 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgg8j\" (UniqueName: \"kubernetes.io/projected/8ef5e4a5-46f1-4f72-ab91-699865d33243-kube-api-access-hgg8j\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.400342 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.400377 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a537ec-7743-44bd-b428-fa52adf39305-combined-ca-bundle\") pod \"barbican-db-sync-7w4sz\" (UID: \"38a537ec-7743-44bd-b428-fa52adf39305\") " pod="openstack/barbican-db-sync-7w4sz" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.400449 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ef5e4a5-46f1-4f72-ab91-699865d33243-run-httpd\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.400541 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/38a537ec-7743-44bd-b428-fa52adf39305-db-sync-config-data\") pod \"barbican-db-sync-7w4sz\" (UID: \"38a537ec-7743-44bd-b428-fa52adf39305\") " pod="openstack/barbican-db-sync-7w4sz" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.405177 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ef5e4a5-46f1-4f72-ab91-699865d33243-log-httpd\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.413518 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ef5e4a5-46f1-4f72-ab91-699865d33243-run-httpd\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.413619 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7w4sz"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.414680 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-scripts\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.421332 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-config-data\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.422219 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.433840 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.441841 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgg8j\" (UniqueName: \"kubernetes.io/projected/8ef5e4a5-46f1-4f72-ab91-699865d33243-kube-api-access-hgg8j\") pod \"ceilometer-0\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.467998 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-9tdjb"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.504168 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.504286 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/38a537ec-7743-44bd-b428-fa52adf39305-combined-ca-bundle\") pod \"barbican-db-sync-7w4sz\" (UID: \"38a537ec-7743-44bd-b428-fa52adf39305\") " pod="openstack/barbican-db-sync-7w4sz" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.504324 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-dns-svc\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.504366 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.504389 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-config\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.504445 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38a537ec-7743-44bd-b428-fa52adf39305-db-sync-config-data\") pod \"barbican-db-sync-7w4sz\" (UID: \"38a537ec-7743-44bd-b428-fa52adf39305\") " pod="openstack/barbican-db-sync-7w4sz" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.504465 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68vbw\" (UniqueName: \"kubernetes.io/projected/8aa18c37-6541-40d6-97b1-007a024605ec-kube-api-access-68vbw\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.504516 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5cb6\" (UniqueName: \"kubernetes.io/projected/38a537ec-7743-44bd-b428-fa52adf39305-kube-api-access-z5cb6\") pod \"barbican-db-sync-7w4sz\" (UID: \"38a537ec-7743-44bd-b428-fa52adf39305\") " pod="openstack/barbican-db-sync-7w4sz" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.514864 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38a537ec-7743-44bd-b428-fa52adf39305-db-sync-config-data\") pod \"barbican-db-sync-7w4sz\" (UID: \"38a537ec-7743-44bd-b428-fa52adf39305\") " pod="openstack/barbican-db-sync-7w4sz" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.530746 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.542159 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5cb6\" (UniqueName: \"kubernetes.io/projected/38a537ec-7743-44bd-b428-fa52adf39305-kube-api-access-z5cb6\") pod \"barbican-db-sync-7w4sz\" (UID: \"38a537ec-7743-44bd-b428-fa52adf39305\") " pod="openstack/barbican-db-sync-7w4sz" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 
16:11:47.550748 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.554188 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.555879 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-86t29" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.556148 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.574964 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a537ec-7743-44bd-b428-fa52adf39305-combined-ca-bundle\") pod \"barbican-db-sync-7w4sz\" (UID: \"38a537ec-7743-44bd-b428-fa52adf39305\") " pod="openstack/barbican-db-sync-7w4sz" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.574976 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.606277 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.606344 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-config\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.606394 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.606417 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68vbw\" (UniqueName: \"kubernetes.io/projected/8aa18c37-6541-40d6-97b1-007a024605ec-kube-api-access-68vbw\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.606448 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.606468 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcsnq\" (UniqueName: \"kubernetes.io/projected/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-kube-api-access-gcsnq\") pod \"glance-default-external-api-0\" (UID: 
\"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.606566 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-config-data\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.606599 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.615672 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-scripts\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.615742 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.615874 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-logs\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.615959 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-dns-svc\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.614441 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.612448 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-config\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.614794 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " 
pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.618940 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-dns-svc\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.632239 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.642770 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.644593 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.648003 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.654413 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7w4sz" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.655601 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68vbw\" (UniqueName: \"kubernetes.io/projected/8aa18c37-6541-40d6-97b1-007a024605ec-kube-api-access-68vbw\") pod \"dnsmasq-dns-f84976bdf-9tdjb\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.658145 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"9fa1e8e11b02fc983f900e0e475dd93f840554c248677f1c492fdfc6801a214f"} Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.692994 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.719304 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.719356 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcsnq\" (UniqueName: \"kubernetes.io/projected/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-kube-api-access-gcsnq\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.719403 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-config-data\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.719451 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-scripts\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.719476 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.719525 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-logs\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.719597 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.720941 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.721394 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-logs\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.727069 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-scripts\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.727200 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.727263 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/75804d914e821b5e0a9ece92cf1b2b0f3da08753d3294f4ad4199fba37b189f7/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.728381 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.744756 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-config-data\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.751691 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcsnq\" (UniqueName: \"kubernetes.io/projected/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-kube-api-access-gcsnq\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.762382 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.798737 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-bslrt"] Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.827210 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.827305 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvjht\" (UniqueName: \"kubernetes.io/projected/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-kube-api-access-xvjht\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.827334 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.827378 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.827439 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.827499 4937 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-logs\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.827546 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.840630 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"glance-default-external-api-0\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: W0225 16:11:47.847205 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc191ad9_1ac9_4372_a31e_dabd1b174031.slice/crio-55237e368b6993b2863198fc8a42fd6fd5b1a4d2d2ee4ab80026bd64c1749534 WatchSource:0}: Error finding container 55237e368b6993b2863198fc8a42fd6fd5b1a4d2d2ee4ab80026bd64c1749534: Status 404 returned error can't find the container with id 55237e368b6993b2863198fc8a42fd6fd5b1a4d2d2ee4ab80026bd64c1749534 Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.895530 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4vmm4"] Feb 25 16:11:47 crc kubenswrapper[4937]: W0225 16:11:47.909087 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90764917_3dc9_4778_b224_67cb4ae1e49d.slice/crio-9f5c3312593d86df03e5320ca5220e3801b7aeb0e031908f1259136690d83463 WatchSource:0}: Error finding container 9f5c3312593d86df03e5320ca5220e3801b7aeb0e031908f1259136690d83463: Status 404 returned error can't find the container with id 9f5c3312593d86df03e5320ca5220e3801b7aeb0e031908f1259136690d83463 Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.915306 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.930255 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.930341 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-logs\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.930398 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.930441 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.930466 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvjht\" (UniqueName: \"kubernetes.io/projected/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-kube-api-access-xvjht\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.930520 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.930591 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.932431 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.932790 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-logs\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 
25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.938721 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.940794 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.941775 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.956656 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.956704 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/53a683be7e1f0cfe8980b8900c6ef26fb8068fb3b32445402dc99b2c4e60848d/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 25 16:11:47 crc kubenswrapper[4937]: I0225 16:11:47.977713 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvjht\" (UniqueName: \"kubernetes.io/projected/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-kube-api-access-xvjht\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.070348 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"glance-default-internal-api-0\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.112702 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6rpk2"] Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.164206 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-z6c5g"] Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.296070 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.339713 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x95b4"] Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.361739 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8tmw5"] Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.395519 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-b4zhr"] Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.570017 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7w4sz"] Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.677328 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4vmm4" event={"ID":"90764917-3dc9-4778-b224-67cb4ae1e49d","Type":"ContainerStarted","Data":"f828440b21ea18ae8263a8d3f11bc51cf5f99ec4284f45e31327cfe60dba52ee"} Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.677726 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4vmm4" event={"ID":"90764917-3dc9-4778-b224-67cb4ae1e49d","Type":"ContainerStarted","Data":"9f5c3312593d86df03e5320ca5220e3801b7aeb0e031908f1259136690d83463"} Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.678787 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-b4zhr" event={"ID":"44849697-9b41-4439-b8c7-f497036543aa","Type":"ContainerStarted","Data":"43b2a7c2ab5c47d272d4eb451a698820cdbc1a492e4f98268bf7165d67dc0ad4"} Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.692581 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"4a965205853570c110b19bfcc5d92d4c4a9633932526b3a14f6a8327b92e9027"} Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.692635 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"82b9bb63f2d2d6711cb8e98feb00b2ce96d1247c2ac498b0d46a7609aa74daea"} Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.697286 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6rpk2" event={"ID":"006fb5e7-a244-4758-8065-3615f5a2b9b7","Type":"ContainerStarted","Data":"545b80e17fb664f28caed5eeb37b07fc6940230f941b48a47f04d1d5e8789ca2"} Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.698639 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-z6c5g" event={"ID":"6ac4754b-f29c-43cf-9bae-9902a5670866","Type":"ContainerStarted","Data":"d4ef5f9c8843abff9803e37aade4136b04a3bc5d5b0f55862dd8b647334720f7"} Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.699959 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x95b4" event={"ID":"aa5bc946-0e69-40ae-a76e-06e9d477bcaf","Type":"ContainerStarted","Data":"9f5121da44b26ec2522fb7b115d911111e69754a759a9b94350d9d6eb60d8bb3"} Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.701399 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" event={"ID":"cc191ad9-1ac9-4372-a31e-dabd1b174031","Type":"ContainerStarted","Data":"708d7b19655cfb9a43eb6f041882b4b58e58da91adfbc3c84aa972acba25b828"} Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.701421 4937 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" event={"ID":"cc191ad9-1ac9-4372-a31e-dabd1b174031","Type":"ContainerStarted","Data":"55237e368b6993b2863198fc8a42fd6fd5b1a4d2d2ee4ab80026bd64c1749534"} Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.702646 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8tmw5" event={"ID":"8f988c32-d57e-4e63-add5-1e86a8818641","Type":"ContainerStarted","Data":"1cd7f9326358cb1c4c96117b56a0d9c188c1e63bba939730c335a45ddd9ba232"} Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.706813 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7w4sz" event={"ID":"38a537ec-7743-44bd-b428-fa52adf39305","Type":"ContainerStarted","Data":"7c642085466985ced2df8bee6d25317f1128749d0efbef394ae553a98c3d1e76"} Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.973284 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:11:48 crc kubenswrapper[4937]: I0225 16:11:48.989025 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-9tdjb"] Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.294184 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:11:49 crc kubenswrapper[4937]: W0225 16:11:49.340668 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84dfe2e8_5a8a_4fad_ad83_3db116c23bd8.slice/crio-ae7d74e887da4e8d8f4b610f50ce2c8f0601beb840b386cbccfaefe5416c5bed WatchSource:0}: Error finding container ae7d74e887da4e8d8f4b610f50ce2c8f0601beb840b386cbccfaefe5416c5bed: Status 404 returned error can't find the container with id ae7d74e887da4e8d8f4b610f50ce2c8f0601beb840b386cbccfaefe5416c5bed Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.497292 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.572712 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.694192 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.730977 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x95b4" event={"ID":"aa5bc946-0e69-40ae-a76e-06e9d477bcaf","Type":"ContainerStarted","Data":"62353a78138d2688a31e712416ee99ca54669f43d67873016bea95f2f68021f6"} Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.770163 4937 generic.go:334] "Generic (PLEG): container finished" podID="cc191ad9-1ac9-4372-a31e-dabd1b174031" containerID="708d7b19655cfb9a43eb6f041882b4b58e58da91adfbc3c84aa972acba25b828" exitCode=0 Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.770237 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" event={"ID":"cc191ad9-1ac9-4372-a31e-dabd1b174031","Type":"ContainerDied","Data":"708d7b19655cfb9a43eb6f041882b4b58e58da91adfbc3c84aa972acba25b828"} Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.781515 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-x95b4" podStartSLOduration=3.781491452 podStartE2EDuration="3.781491452s" podCreationTimestamp="2026-02-25 16:11:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:49.765641715 +0000 UTC m=+1560.779033605" watchObservedRunningTime="2026-02-25 16:11:49.781491452 +0000 UTC m=+1560.794883342" Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.785759 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" event={"ID":"8aa18c37-6541-40d6-97b1-007a024605ec","Type":"ContainerStarted","Data":"eaf086c05cd4587e2aa4e67d0df0c7923001ac4d445fec99dcb2afa8c643215a"} Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.787169 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ef5e4a5-46f1-4f72-ab91-699865d33243","Type":"ContainerStarted","Data":"c4beda9d4e738b69c4df08c3b7dd682f48d63081659c82a5b48be8d24813114f"} Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.808861 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8","Type":"ContainerStarted","Data":"ae7d74e887da4e8d8f4b610f50ce2c8f0601beb840b386cbccfaefe5416c5bed"} Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.832165 4937 generic.go:334] "Generic (PLEG): container finished" podID="6ac4754b-f29c-43cf-9bae-9902a5670866" containerID="5aa6978a0ba496d267d5965c87f6c119566076d0c3fd5dd241660c92324f4796" exitCode=0 Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.834167 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-z6c5g" event={"ID":"6ac4754b-f29c-43cf-9bae-9902a5670866","Type":"ContainerDied","Data":"5aa6978a0ba496d267d5965c87f6c119566076d0c3fd5dd241660c92324f4796"} Feb 25 16:11:49 crc kubenswrapper[4937]: I0225 16:11:49.886714 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-4vmm4" podStartSLOduration=3.886680879 podStartE2EDuration="3.886680879s" podCreationTimestamp="2026-02-25 16:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:49.847566198 +0000 UTC m=+1560.860958088" watchObservedRunningTime="2026-02-25 16:11:49.886680879 +0000 UTC m=+1560.900072779" Feb 25 16:11:50 crc kubenswrapper[4937]: I0225 16:11:50.076138 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.815749 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.822527 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.905859 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.906769 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-bslrt" event={"ID":"cc191ad9-1ac9-4372-a31e-dabd1b174031","Type":"ContainerDied","Data":"55237e368b6993b2863198fc8a42fd6fd5b1a4d2d2ee4ab80026bd64c1749534"} Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.906863 4937 scope.go:117] "RemoveContainer" containerID="708d7b19655cfb9a43eb6f041882b4b58e58da91adfbc3c84aa972acba25b828" Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.965495 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-z6c5g" event={"ID":"6ac4754b-f29c-43cf-9bae-9902a5670866","Type":"ContainerDied","Data":"d4ef5f9c8843abff9803e37aade4136b04a3bc5d5b0f55862dd8b647334720f7"} Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.965603 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-z6c5g" Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.968253 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925","Type":"ContainerStarted","Data":"da80e89336e40e4f085741cd084deea9ec71098237e2ac72923db40c5d23ce82"} Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.977895 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhkh7\" (UniqueName: \"kubernetes.io/projected/cc191ad9-1ac9-4372-a31e-dabd1b174031-kube-api-access-nhkh7\") pod \"cc191ad9-1ac9-4372-a31e-dabd1b174031\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.977964 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-ovsdbserver-nb\") pod \"cc191ad9-1ac9-4372-a31e-dabd1b174031\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.978022 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-config\") pod \"cc191ad9-1ac9-4372-a31e-dabd1b174031\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.978052 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-config\") pod \"6ac4754b-f29c-43cf-9bae-9902a5670866\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.978086 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-dns-svc\") pod \"cc191ad9-1ac9-4372-a31e-dabd1b174031\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.978115 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-ovsdbserver-sb\") pod \"6ac4754b-f29c-43cf-9bae-9902a5670866\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.978159 4937 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-ovsdbserver-nb\") pod \"6ac4754b-f29c-43cf-9bae-9902a5670866\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.978213 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-ovsdbserver-sb\") pod \"cc191ad9-1ac9-4372-a31e-dabd1b174031\" (UID: \"cc191ad9-1ac9-4372-a31e-dabd1b174031\") " Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.978244 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqf2k\" (UniqueName: \"kubernetes.io/projected/6ac4754b-f29c-43cf-9bae-9902a5670866-kube-api-access-fqf2k\") pod \"6ac4754b-f29c-43cf-9bae-9902a5670866\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " Feb 25 16:11:51 crc kubenswrapper[4937]: I0225 16:11:51.978287 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-dns-svc\") pod \"6ac4754b-f29c-43cf-9bae-9902a5670866\" (UID: \"6ac4754b-f29c-43cf-9bae-9902a5670866\") " Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.007962 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc191ad9-1ac9-4372-a31e-dabd1b174031-kube-api-access-nhkh7" (OuterVolumeSpecName: "kube-api-access-nhkh7") pod "cc191ad9-1ac9-4372-a31e-dabd1b174031" (UID: "cc191ad9-1ac9-4372-a31e-dabd1b174031"). InnerVolumeSpecName "kube-api-access-nhkh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.031064 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac4754b-f29c-43cf-9bae-9902a5670866-kube-api-access-fqf2k" (OuterVolumeSpecName: "kube-api-access-fqf2k") pod "6ac4754b-f29c-43cf-9bae-9902a5670866" (UID: "6ac4754b-f29c-43cf-9bae-9902a5670866"). InnerVolumeSpecName "kube-api-access-fqf2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.055311 4937 scope.go:117] "RemoveContainer" containerID="5aa6978a0ba496d267d5965c87f6c119566076d0c3fd5dd241660c92324f4796" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.074320 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ac4754b-f29c-43cf-9bae-9902a5670866" (UID: "6ac4754b-f29c-43cf-9bae-9902a5670866"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.092169 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhkh7\" (UniqueName: \"kubernetes.io/projected/cc191ad9-1ac9-4372-a31e-dabd1b174031-kube-api-access-nhkh7\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.092197 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.092207 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqf2k\" (UniqueName: \"kubernetes.io/projected/6ac4754b-f29c-43cf-9bae-9902a5670866-kube-api-access-fqf2k\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.096238 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc191ad9-1ac9-4372-a31e-dabd1b174031" (UID: "cc191ad9-1ac9-4372-a31e-dabd1b174031"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.129049 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ac4754b-f29c-43cf-9bae-9902a5670866" (UID: "6ac4754b-f29c-43cf-9bae-9902a5670866"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.154019 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-config" (OuterVolumeSpecName: "config") pod "6ac4754b-f29c-43cf-9bae-9902a5670866" (UID: "6ac4754b-f29c-43cf-9bae-9902a5670866"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.161645 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc191ad9-1ac9-4372-a31e-dabd1b174031" (UID: "cc191ad9-1ac9-4372-a31e-dabd1b174031"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.163196 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc191ad9-1ac9-4372-a31e-dabd1b174031" (UID: "cc191ad9-1ac9-4372-a31e-dabd1b174031"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.166896 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-config" (OuterVolumeSpecName: "config") pod "cc191ad9-1ac9-4372-a31e-dabd1b174031" (UID: "cc191ad9-1ac9-4372-a31e-dabd1b174031"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.172272 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ac4754b-f29c-43cf-9bae-9902a5670866" (UID: "6ac4754b-f29c-43cf-9bae-9902a5670866"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.197496 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.197532 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.197543 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.197552 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.197577 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.197588 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc191ad9-1ac9-4372-a31e-dabd1b174031-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.197595 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ac4754b-f29c-43cf-9bae-9902a5670866-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.269411 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-bslrt"] Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.278533 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-bslrt"] Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.374370 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-z6c5g"] Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.397655 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-z6c5g"] Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.980310 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"94da75bc821dae4b4d89d64b4f72b55dd35dfc7f770a5b3ba3259bee6f864949"} Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 16:11:52.982906 4937 generic.go:334] "Generic (PLEG): container finished" podID="8aa18c37-6541-40d6-97b1-007a024605ec" containerID="e7414a68c6f534c57664fa2f6642ceb9b67135598d756b32a16ce3b14b78a418" exitCode=0 Feb 25 16:11:52 crc kubenswrapper[4937]: I0225 
16:11:52.982956 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" event={"ID":"8aa18c37-6541-40d6-97b1-007a024605ec","Type":"ContainerDied","Data":"e7414a68c6f534c57664fa2f6642ceb9b67135598d756b32a16ce3b14b78a418"} Feb 25 16:11:53 crc kubenswrapper[4937]: I0225 16:11:53.379613 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac4754b-f29c-43cf-9bae-9902a5670866" path="/var/lib/kubelet/pods/6ac4754b-f29c-43cf-9bae-9902a5670866/volumes" Feb 25 16:11:53 crc kubenswrapper[4937]: I0225 16:11:53.381143 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc191ad9-1ac9-4372-a31e-dabd1b174031" path="/var/lib/kubelet/pods/cc191ad9-1ac9-4372-a31e-dabd1b174031/volumes" Feb 25 16:11:54 crc kubenswrapper[4937]: I0225 16:11:54.008962 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8","Type":"ContainerStarted","Data":"b5892b819d2cbe4c4048f7f1bc8d670c5a45525ba43ae5a70eccd31d9c0df8aa"} Feb 25 16:11:54 crc kubenswrapper[4937]: I0225 16:11:54.023329 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925","Type":"ContainerStarted","Data":"9c4c47f7ad325d62685685ac806ca0d0f52131c5f2fb11a44098758d49c54951"} Feb 25 16:11:54 crc kubenswrapper[4937]: I0225 16:11:54.028872 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" event={"ID":"8aa18c37-6541-40d6-97b1-007a024605ec","Type":"ContainerStarted","Data":"853b2165860ad0d6189ee4fb9db66b1cc68e7450c867432d6a46cf550228b6b2"} Feb 25 16:11:54 crc kubenswrapper[4937]: I0225 16:11:54.029008 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:11:54 crc kubenswrapper[4937]: I0225 16:11:54.047542 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" podStartSLOduration=8.047520834 podStartE2EDuration="8.047520834s" podCreationTimestamp="2026-02-25 16:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:54.047451212 +0000 UTC m=+1565.060843102" watchObservedRunningTime="2026-02-25 16:11:54.047520834 +0000 UTC m=+1565.060912724" Feb 25 16:11:55 crc kubenswrapper[4937]: I0225 16:11:55.100077 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925","Type":"ContainerStarted","Data":"b6da8204356ff39a00ea99d54dc40b9b8636f0cc59c85f0911e2068123d6ba93"} Feb 25 16:11:55 crc kubenswrapper[4937]: I0225 16:11:55.100190 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" containerName="glance-log" containerID="cri-o://9c4c47f7ad325d62685685ac806ca0d0f52131c5f2fb11a44098758d49c54951" gracePeriod=30 Feb 25 16:11:55 crc kubenswrapper[4937]: I0225 16:11:55.100907 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" containerName="glance-httpd" containerID="cri-o://b6da8204356ff39a00ea99d54dc40b9b8636f0cc59c85f0911e2068123d6ba93" gracePeriod=30 Feb 25 16:11:55 crc kubenswrapper[4937]: I0225 16:11:55.116630 
4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8","Type":"ContainerStarted","Data":"ae00e1d43fdb53cf7f2889ee0de867d00e6d545940c114f95c7fc3b3bcbcd53e"} Feb 25 16:11:55 crc kubenswrapper[4937]: I0225 16:11:55.116936 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" containerName="glance-log" containerID="cri-o://b5892b819d2cbe4c4048f7f1bc8d670c5a45525ba43ae5a70eccd31d9c0df8aa" gracePeriod=30 Feb 25 16:11:55 crc kubenswrapper[4937]: I0225 16:11:55.117083 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" containerName="glance-httpd" containerID="cri-o://ae00e1d43fdb53cf7f2889ee0de867d00e6d545940c114f95c7fc3b3bcbcd53e" gracePeriod=30 Feb 25 16:11:55 crc kubenswrapper[4937]: I0225 16:11:55.160749 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.160730758 podStartE2EDuration="9.160730758s" podCreationTimestamp="2026-02-25 16:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:55.135628279 +0000 UTC m=+1566.149020169" watchObservedRunningTime="2026-02-25 16:11:55.160730758 +0000 UTC m=+1566.174122648" Feb 25 16:11:55 crc kubenswrapper[4937]: I0225 16:11:55.177978 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.17795993 podStartE2EDuration="9.17795993s" podCreationTimestamp="2026-02-25 16:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:11:55.164599545 +0000 UTC m=+1566.177991445" watchObservedRunningTime="2026-02-25 16:11:55.17795993 +0000 UTC m=+1566.191351820" Feb 25 16:11:56 crc kubenswrapper[4937]: I0225 16:11:56.137551 4937 generic.go:334] "Generic (PLEG): container finished" podID="7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" containerID="b6da8204356ff39a00ea99d54dc40b9b8636f0cc59c85f0911e2068123d6ba93" exitCode=0 Feb 25 16:11:56 crc kubenswrapper[4937]: I0225 16:11:56.138176 4937 generic.go:334] "Generic (PLEG): container finished" podID="7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" containerID="9c4c47f7ad325d62685685ac806ca0d0f52131c5f2fb11a44098758d49c54951" exitCode=143 Feb 25 16:11:56 crc kubenswrapper[4937]: I0225 16:11:56.138248 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925","Type":"ContainerDied","Data":"b6da8204356ff39a00ea99d54dc40b9b8636f0cc59c85f0911e2068123d6ba93"} Feb 25 16:11:56 crc kubenswrapper[4937]: I0225 16:11:56.138280 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925","Type":"ContainerDied","Data":"9c4c47f7ad325d62685685ac806ca0d0f52131c5f2fb11a44098758d49c54951"} Feb 25 16:11:56 crc kubenswrapper[4937]: I0225 16:11:56.143549 4937 generic.go:334] "Generic (PLEG): container finished" podID="84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" containerID="ae00e1d43fdb53cf7f2889ee0de867d00e6d545940c114f95c7fc3b3bcbcd53e" exitCode=0 Feb 25 16:11:56 crc kubenswrapper[4937]: I0225 
16:11:56.143586 4937 generic.go:334] "Generic (PLEG): container finished" podID="84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" containerID="b5892b819d2cbe4c4048f7f1bc8d670c5a45525ba43ae5a70eccd31d9c0df8aa" exitCode=143 Feb 25 16:11:56 crc kubenswrapper[4937]: I0225 16:11:56.143677 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8","Type":"ContainerDied","Data":"ae00e1d43fdb53cf7f2889ee0de867d00e6d545940c114f95c7fc3b3bcbcd53e"} Feb 25 16:11:56 crc kubenswrapper[4937]: I0225 16:11:56.143713 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8","Type":"ContainerDied","Data":"b5892b819d2cbe4c4048f7f1bc8d670c5a45525ba43ae5a70eccd31d9c0df8aa"} Feb 25 16:11:56 crc kubenswrapper[4937]: I0225 16:11:56.241579 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 25 16:11:56 crc kubenswrapper[4937]: I0225 16:11:56.267591 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.231650 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925","Type":"ContainerDied","Data":"da80e89336e40e4f085741cd084deea9ec71098237e2ac72923db40c5d23ce82"} Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.231985 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da80e89336e40e4f085741cd084deea9ec71098237e2ac72923db40c5d23ce82" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.255503 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8","Type":"ContainerDied","Data":"ae7d74e887da4e8d8f4b610f50ce2c8f0601beb840b386cbccfaefe5416c5bed"} Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.255556 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae7d74e887da4e8d8f4b610f50ce2c8f0601beb840b386cbccfaefe5416c5bed" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.262336 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.472968 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.532876 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.565987 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.566069 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-httpd-run\") pod \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.566092 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcsnq\" (UniqueName: \"kubernetes.io/projected/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-kube-api-access-gcsnq\") pod \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.566123 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-scripts\") pod \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.566200 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-scripts\") pod \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.566218 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-logs\") pod \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.566234 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-config-data\") pod \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.566253 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-combined-ca-bundle\") pod \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.566288 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-config-data\") pod \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.566313 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvjht\" (UniqueName: \"kubernetes.io/projected/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-kube-api-access-xvjht\") pod \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\" (UID: 
\"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.566401 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-combined-ca-bundle\") pod \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\" (UID: \"7fbbb4f6-1ce2-4663-a3bb-a8bd16630925\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.566440 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-httpd-run\") pod \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.566508 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.566544 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-logs\") pod \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\" (UID: \"84dfe2e8-5a8a-4fad-ad83-3db116c23bd8\") " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.568322 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-logs" (OuterVolumeSpecName: "logs") pod "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" (UID: "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.568957 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" (UID: "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.570856 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-logs" (OuterVolumeSpecName: "logs") pod "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" (UID: "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.572792 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" (UID: "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.584843 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-scripts" (OuterVolumeSpecName: "scripts") pod "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" (UID: "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.602119 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-scripts" (OuterVolumeSpecName: "scripts") pod "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" (UID: "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.605318 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-kube-api-access-gcsnq" (OuterVolumeSpecName: "kube-api-access-gcsnq") pod "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" (UID: "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8"). InnerVolumeSpecName "kube-api-access-gcsnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.620727 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-kube-api-access-xvjht" (OuterVolumeSpecName: "kube-api-access-xvjht") pod "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" (UID: "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925"). InnerVolumeSpecName "kube-api-access-xvjht". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.627390 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582" (OuterVolumeSpecName: "glance") pod "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" (UID: "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925"). InnerVolumeSpecName "pvc-a6cbd030-8970-4e53-96ea-5336de37e582". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.647318 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6" (OuterVolumeSpecName: "glance") pod "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" (UID: "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8"). InnerVolumeSpecName "pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.649782 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" (UID: "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.659650 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" (UID: "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.669222 4937 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") on node \"crc\" " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.669256 4937 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.669273 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcsnq\" (UniqueName: \"kubernetes.io/projected/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-kube-api-access-gcsnq\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.669287 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.669297 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.669311 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.669322 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.669333 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvjht\" (UniqueName: \"kubernetes.io/projected/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-kube-api-access-xvjht\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.669345 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.669358 4937 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.669378 4937 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") on node \"crc\" " Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.669395 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.685589 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-config-data" (OuterVolumeSpecName: 
"config-data") pod "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" (UID: "84dfe2e8-5a8a-4fad-ad83-3db116c23bd8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.706423 4937 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.706594 4937 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6") on node "crc" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.711603 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-config-data" (OuterVolumeSpecName: "config-data") pod "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" (UID: "7fbbb4f6-1ce2-4663-a3bb-a8bd16630925"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.726798 4937 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.727043 4937 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a6cbd030-8970-4e53-96ea-5336de37e582" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582") on node "crc" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.772747 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.772782 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.772794 4937 reconciler_common.go:293] "Volume detached for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:57 crc kubenswrapper[4937]: I0225 16:11:57.772804 4937 reconciler_common.go:293] "Volume detached for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") on node \"crc\" DevicePath \"\"" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.270268 4937 generic.go:334] "Generic (PLEG): container finished" podID="aa5bc946-0e69-40ae-a76e-06e9d477bcaf" containerID="62353a78138d2688a31e712416ee99ca54669f43d67873016bea95f2f68021f6" exitCode=0 Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.270502 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x95b4" event={"ID":"aa5bc946-0e69-40ae-a76e-06e9d477bcaf","Type":"ContainerDied","Data":"62353a78138d2688a31e712416ee99ca54669f43d67873016bea95f2f68021f6"} Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.277261 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.277287 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"048cce91cf5c84987e86ee8f7988f1a08c3f3e7168ceba5890457bba41f9f3fd"} Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.277315 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.277332 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"f61c0433438868c57b99e9e196f99e5acf63952eb171ad4214601c829fcdb973"} Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.277348 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"c7d384b09d43d69c416e91c5a773079b87e839ccd45182facad6f1ce1749cdd6"} Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.368706 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.393553 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.422547 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.449576 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.458979 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:11:58 crc kubenswrapper[4937]: E0225 16:11:58.459814 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac4754b-f29c-43cf-9bae-9902a5670866" containerName="init" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.459829 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac4754b-f29c-43cf-9bae-9902a5670866" containerName="init" Feb 25 16:11:58 crc kubenswrapper[4937]: E0225 16:11:58.459846 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc191ad9-1ac9-4372-a31e-dabd1b174031" containerName="init" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.459852 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc191ad9-1ac9-4372-a31e-dabd1b174031" containerName="init" Feb 25 16:11:58 crc kubenswrapper[4937]: E0225 16:11:58.459864 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" containerName="glance-log" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.459870 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" containerName="glance-log" Feb 25 16:11:58 crc kubenswrapper[4937]: E0225 16:11:58.459881 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" containerName="glance-httpd" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.459887 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" containerName="glance-httpd" Feb 25 16:11:58 crc 
kubenswrapper[4937]: E0225 16:11:58.459898 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" containerName="glance-log" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.459904 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" containerName="glance-log" Feb 25 16:11:58 crc kubenswrapper[4937]: E0225 16:11:58.459918 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" containerName="glance-httpd" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.459923 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" containerName="glance-httpd" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.460129 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" containerName="glance-log" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.460152 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac4754b-f29c-43cf-9bae-9902a5670866" containerName="init" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.460166 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc191ad9-1ac9-4372-a31e-dabd1b174031" containerName="init" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.460178 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" containerName="glance-httpd" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.460191 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" containerName="glance-log" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.460201 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" containerName="glance-httpd" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.461691 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.466990 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.467218 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-86t29" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.467421 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.467566 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.489865 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.530350 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.532003 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.532076 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.536550 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.536769 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.591133 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c021e5b5-9038-4a91-8785-7461a1d3c981-logs\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.591225 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kz8v\" (UniqueName: \"kubernetes.io/projected/c021e5b5-9038-4a91-8785-7461a1d3c981-kube-api-access-6kz8v\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.591306 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.591369 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.591394 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c021e5b5-9038-4a91-8785-7461a1d3c981-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.591420 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.591442 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.591492 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.692833 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.692878 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c021e5b5-9038-4a91-8785-7461a1d3c981-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.692905 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.692922 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.692956 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.692988 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c021e5b5-9038-4a91-8785-7461a1d3c981-logs\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.693025 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fac18368-fe1e-4431-bf59-1c1e613bc0d6-logs\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.693059 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kz8v\" (UniqueName: \"kubernetes.io/projected/c021e5b5-9038-4a91-8785-7461a1d3c981-kube-api-access-6kz8v\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.693079 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/fac18368-fe1e-4431-bf59-1c1e613bc0d6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.693106 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-scripts\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.693132 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.693197 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.693212 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjgzz\" (UniqueName: \"kubernetes.io/projected/fac18368-fe1e-4431-bf59-1c1e613bc0d6-kube-api-access-cjgzz\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.693226 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.693254 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-config-data\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.693278 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.697090 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c021e5b5-9038-4a91-8785-7461a1d3c981-logs\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.698323 4937 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c021e5b5-9038-4a91-8785-7461a1d3c981-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.707837 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.709678 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.754390 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.755991 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.761643 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kz8v\" (UniqueName: \"kubernetes.io/projected/c021e5b5-9038-4a91-8785-7461a1d3c981-kube-api-access-6kz8v\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.764391 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.764436 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/53a683be7e1f0cfe8980b8900c6ef26fb8068fb3b32445402dc99b2c4e60848d/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.794387 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fac18368-fe1e-4431-bf59-1c1e613bc0d6-logs\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.794458 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fac18368-fe1e-4431-bf59-1c1e613bc0d6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.794507 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-scripts\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.794531 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.794564 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjgzz\" (UniqueName: \"kubernetes.io/projected/fac18368-fe1e-4431-bf59-1c1e613bc0d6-kube-api-access-cjgzz\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.794581 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.794609 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-config-data\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.794636 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.796587 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fac18368-fe1e-4431-bf59-1c1e613bc0d6-logs\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.796854 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fac18368-fe1e-4431-bf59-1c1e613bc0d6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.805012 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.806120 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.807232 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-scripts\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.817859 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-config-data\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.822575 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjgzz\" (UniqueName: \"kubernetes.io/projected/fac18368-fe1e-4431-bf59-1c1e613bc0d6-kube-api-access-cjgzz\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.935544 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 16:11:58 crc kubenswrapper[4937]: I0225 16:11:58.935591 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/75804d914e821b5e0a9ece92cf1b2b0f3da08753d3294f4ad4199fba37b189f7/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 25 16:11:59 crc kubenswrapper[4937]: I0225 16:11:59.052013 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"glance-default-external-api-0\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " pod="openstack/glance-default-external-api-0" Feb 25 16:11:59 crc kubenswrapper[4937]: I0225 16:11:59.054651 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"glance-default-internal-api-0\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:11:59 crc kubenswrapper[4937]: I0225 16:11:59.112341 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 16:11:59 crc kubenswrapper[4937]: I0225 16:11:59.155475 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 16:11:59 crc kubenswrapper[4937]: I0225 16:11:59.325667 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"91cbb0409fce91330a0998a171430ecaa7bde77ae05a299b97cfffe83fd34328"} Feb 25 16:11:59 crc kubenswrapper[4937]: I0225 16:11:59.335927 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:11:59 crc kubenswrapper[4937]: I0225 16:11:59.336157 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="config-reloader" containerID="cri-o://fc97b8e06f2912e439a8df9a3ca91b06cb74e379b0a69a88fd74be43db38ea21" gracePeriod=600 Feb 25 16:11:59 crc kubenswrapper[4937]: I0225 16:11:59.336262 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="prometheus" containerID="cri-o://a6f706d7310d97b2900b971c494236f99adf6904c65548a724f429abe7365374" gracePeriod=600 Feb 25 16:11:59 crc kubenswrapper[4937]: I0225 16:11:59.336297 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="thanos-sidecar" containerID="cri-o://ce7810c7559494ae2ce2d68e150d814e5181253265d082b3bf485bde3ffceb08" gracePeriod=600 Feb 25 16:11:59 crc kubenswrapper[4937]: I0225 16:11:59.399928 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fbbb4f6-1ce2-4663-a3bb-a8bd16630925" 
path="/var/lib/kubelet/pods/7fbbb4f6-1ce2-4663-a3bb-a8bd16630925/volumes" Feb 25 16:11:59 crc kubenswrapper[4937]: I0225 16:11:59.400844 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84dfe2e8-5a8a-4fad-ad83-3db116c23bd8" path="/var/lib/kubelet/pods/84dfe2e8-5a8a-4fad-ad83-3db116c23bd8/volumes" Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.133304 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533932-w2rgh"] Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.134961 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533932-w2rgh" Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.138257 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.138561 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.141866 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533932-w2rgh"] Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.138829 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.231241 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxc5s\" (UniqueName: \"kubernetes.io/projected/fa840e84-cc33-4fed-9f62-50fc286001be-kube-api-access-pxc5s\") pod \"auto-csr-approver-29533932-w2rgh\" (UID: \"fa840e84-cc33-4fed-9f62-50fc286001be\") " pod="openshift-infra/auto-csr-approver-29533932-w2rgh" Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.333052 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxc5s\" (UniqueName: \"kubernetes.io/projected/fa840e84-cc33-4fed-9f62-50fc286001be-kube-api-access-pxc5s\") pod \"auto-csr-approver-29533932-w2rgh\" (UID: \"fa840e84-cc33-4fed-9f62-50fc286001be\") " pod="openshift-infra/auto-csr-approver-29533932-w2rgh" Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.347149 4937 generic.go:334] "Generic (PLEG): container finished" podID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerID="a6f706d7310d97b2900b971c494236f99adf6904c65548a724f429abe7365374" exitCode=0 Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.347190 4937 generic.go:334] "Generic (PLEG): container finished" podID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerID="ce7810c7559494ae2ce2d68e150d814e5181253265d082b3bf485bde3ffceb08" exitCode=0 Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.347204 4937 generic.go:334] "Generic (PLEG): container finished" podID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerID="fc97b8e06f2912e439a8df9a3ca91b06cb74e379b0a69a88fd74be43db38ea21" exitCode=0 Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.347226 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e19ac505-41d9-4d1d-b75a-0c88e26960c8","Type":"ContainerDied","Data":"a6f706d7310d97b2900b971c494236f99adf6904c65548a724f429abe7365374"} Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.347271 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"e19ac505-41d9-4d1d-b75a-0c88e26960c8","Type":"ContainerDied","Data":"ce7810c7559494ae2ce2d68e150d814e5181253265d082b3bf485bde3ffceb08"} Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.347284 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e19ac505-41d9-4d1d-b75a-0c88e26960c8","Type":"ContainerDied","Data":"fc97b8e06f2912e439a8df9a3ca91b06cb74e379b0a69a88fd74be43db38ea21"} Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.354813 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxc5s\" (UniqueName: \"kubernetes.io/projected/fa840e84-cc33-4fed-9f62-50fc286001be-kube-api-access-pxc5s\") pod \"auto-csr-approver-29533932-w2rgh\" (UID: \"fa840e84-cc33-4fed-9f62-50fc286001be\") " pod="openshift-infra/auto-csr-approver-29533932-w2rgh" Feb 25 16:12:00 crc kubenswrapper[4937]: I0225 16:12:00.464785 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533932-w2rgh" Feb 25 16:12:01 crc kubenswrapper[4937]: I0225 16:12:01.242531 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.137:9090/-/ready\": dial tcp 10.217.0.137:9090: connect: connection refused" Feb 25 16:12:02 crc kubenswrapper[4937]: I0225 16:12:02.730511 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:12:02 crc kubenswrapper[4937]: I0225 16:12:02.800975 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-6xfpd"] Feb 25 16:12:02 crc kubenswrapper[4937]: I0225 16:12:02.801206 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-6xfpd" podUID="b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" containerName="dnsmasq-dns" containerID="cri-o://dca5c99c0ff4635c5eebdf665c288bff87fa4b0ee4dbd978ff021493176fd38a" gracePeriod=10 Feb 25 16:12:03 crc kubenswrapper[4937]: I0225 16:12:03.375317 4937 generic.go:334] "Generic (PLEG): container finished" podID="b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" containerID="dca5c99c0ff4635c5eebdf665c288bff87fa4b0ee4dbd978ff021493176fd38a" exitCode=0 Feb 25 16:12:03 crc kubenswrapper[4937]: I0225 16:12:03.377734 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-6xfpd" event={"ID":"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7","Type":"ContainerDied","Data":"dca5c99c0ff4635c5eebdf665c288bff87fa4b0ee4dbd978ff021493176fd38a"} Feb 25 16:12:03 crc kubenswrapper[4937]: I0225 16:12:03.548942 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-6xfpd" podUID="b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Feb 25 16:12:05 crc kubenswrapper[4937]: I0225 16:12:05.636617 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-754k4"] Feb 25 16:12:05 crc kubenswrapper[4937]: I0225 16:12:05.641877 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:12:05 crc kubenswrapper[4937]: I0225 16:12:05.657859 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-754k4"] Feb 25 16:12:05 crc kubenswrapper[4937]: I0225 16:12:05.752417 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztnnx\" (UniqueName: \"kubernetes.io/projected/82cf475b-cc29-4d90-a1c4-73e0170f0f48-kube-api-access-ztnnx\") pod \"redhat-operators-754k4\" (UID: \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\") " pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:12:05 crc kubenswrapper[4937]: I0225 16:12:05.752876 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82cf475b-cc29-4d90-a1c4-73e0170f0f48-catalog-content\") pod \"redhat-operators-754k4\" (UID: \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\") " pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:12:05 crc kubenswrapper[4937]: I0225 16:12:05.752966 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82cf475b-cc29-4d90-a1c4-73e0170f0f48-utilities\") pod \"redhat-operators-754k4\" (UID: \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\") " pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:12:05 crc kubenswrapper[4937]: I0225 16:12:05.854918 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82cf475b-cc29-4d90-a1c4-73e0170f0f48-utilities\") pod \"redhat-operators-754k4\" (UID: \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\") " pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:12:05 crc kubenswrapper[4937]: I0225 16:12:05.855010 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztnnx\" (UniqueName: \"kubernetes.io/projected/82cf475b-cc29-4d90-a1c4-73e0170f0f48-kube-api-access-ztnnx\") pod \"redhat-operators-754k4\" (UID: \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\") " pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:12:05 crc kubenswrapper[4937]: I0225 16:12:05.855167 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82cf475b-cc29-4d90-a1c4-73e0170f0f48-catalog-content\") pod \"redhat-operators-754k4\" (UID: \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\") " pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:12:05 crc kubenswrapper[4937]: I0225 16:12:05.855437 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82cf475b-cc29-4d90-a1c4-73e0170f0f48-utilities\") pod \"redhat-operators-754k4\" (UID: \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\") " pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:12:05 crc kubenswrapper[4937]: I0225 16:12:05.855584 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82cf475b-cc29-4d90-a1c4-73e0170f0f48-catalog-content\") pod \"redhat-operators-754k4\" (UID: \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\") " pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:12:05 crc kubenswrapper[4937]: I0225 16:12:05.876103 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ztnnx\" (UniqueName: \"kubernetes.io/projected/82cf475b-cc29-4d90-a1c4-73e0170f0f48-kube-api-access-ztnnx\") pod \"redhat-operators-754k4\" (UID: \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\") " pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:12:05 crc kubenswrapper[4937]: I0225 16:12:05.971226 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:12:06 crc kubenswrapper[4937]: I0225 16:12:06.242085 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.137:9090/-/ready\": dial tcp 10.217.0.137:9090: connect: connection refused" Feb 25 16:12:08 crc kubenswrapper[4937]: I0225 16:12:08.548840 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-6xfpd" podUID="b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Feb 25 16:12:13 crc kubenswrapper[4937]: I0225 16:12:13.977126 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.033679 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-fernet-keys\") pod \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.033808 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mnlg\" (UniqueName: \"kubernetes.io/projected/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-kube-api-access-2mnlg\") pod \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.033853 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-combined-ca-bundle\") pod \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.033959 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-credential-keys\") pod \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.034002 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-config-data\") pod \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.034022 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-scripts\") pod \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\" (UID: \"aa5bc946-0e69-40ae-a76e-06e9d477bcaf\") " Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.044864 4937 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "aa5bc946-0e69-40ae-a76e-06e9d477bcaf" (UID: "aa5bc946-0e69-40ae-a76e-06e9d477bcaf"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.044923 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "aa5bc946-0e69-40ae-a76e-06e9d477bcaf" (UID: "aa5bc946-0e69-40ae-a76e-06e9d477bcaf"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.049337 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-kube-api-access-2mnlg" (OuterVolumeSpecName: "kube-api-access-2mnlg") pod "aa5bc946-0e69-40ae-a76e-06e9d477bcaf" (UID: "aa5bc946-0e69-40ae-a76e-06e9d477bcaf"). InnerVolumeSpecName "kube-api-access-2mnlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.066712 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-scripts" (OuterVolumeSpecName: "scripts") pod "aa5bc946-0e69-40ae-a76e-06e9d477bcaf" (UID: "aa5bc946-0e69-40ae-a76e-06e9d477bcaf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.082185 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-config-data" (OuterVolumeSpecName: "config-data") pod "aa5bc946-0e69-40ae-a76e-06e9d477bcaf" (UID: "aa5bc946-0e69-40ae-a76e-06e9d477bcaf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.108821 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa5bc946-0e69-40ae-a76e-06e9d477bcaf" (UID: "aa5bc946-0e69-40ae-a76e-06e9d477bcaf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.136910 4937 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.136942 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.136973 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.136984 4937 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.136996 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mnlg\" (UniqueName: \"kubernetes.io/projected/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-kube-api-access-2mnlg\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.137009 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bc946-0e69-40ae-a76e-06e9d477bcaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.242841 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.137:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.242988 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.494027 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x95b4" event={"ID":"aa5bc946-0e69-40ae-a76e-06e9d477bcaf","Type":"ContainerDied","Data":"9f5121da44b26ec2522fb7b115d911111e69754a759a9b94350d9d6eb60d8bb3"} Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.494066 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f5121da44b26ec2522fb7b115d911111e69754a759a9b94350d9d6eb60d8bb3" Feb 25 16:12:14 crc kubenswrapper[4937]: I0225 16:12:14.494078 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x95b4" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.180142 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-x95b4"] Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.195061 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-x95b4"] Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.248343 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ldjsj"] Feb 25 16:12:15 crc kubenswrapper[4937]: E0225 16:12:15.249019 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5bc946-0e69-40ae-a76e-06e9d477bcaf" containerName="keystone-bootstrap" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.249106 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5bc946-0e69-40ae-a76e-06e9d477bcaf" containerName="keystone-bootstrap" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.249367 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5bc946-0e69-40ae-a76e-06e9d477bcaf" containerName="keystone-bootstrap" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.250182 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.254225 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.254795 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.254945 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d9p7q" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.254983 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.269576 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ldjsj"] Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.362385 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-combined-ca-bundle\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.362427 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-config-data\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.362456 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-credential-keys\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.362497 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-scripts\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.362526 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-fernet-keys\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.362559 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nppm2\" (UniqueName: \"kubernetes.io/projected/69b3388f-2762-4ebe-a014-e1740aee3b66-kube-api-access-nppm2\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.390695 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5bc946-0e69-40ae-a76e-06e9d477bcaf" path="/var/lib/kubelet/pods/aa5bc946-0e69-40ae-a76e-06e9d477bcaf/volumes" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.463974 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-fernet-keys\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.464026 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nppm2\" (UniqueName: \"kubernetes.io/projected/69b3388f-2762-4ebe-a014-e1740aee3b66-kube-api-access-nppm2\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.464181 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-combined-ca-bundle\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.464202 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-config-data\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.464238 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-credential-keys\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.464277 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-scripts\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc 
kubenswrapper[4937]: I0225 16:12:15.472232 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-fernet-keys\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.481887 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-scripts\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.482756 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nppm2\" (UniqueName: \"kubernetes.io/projected/69b3388f-2762-4ebe-a014-e1740aee3b66-kube-api-access-nppm2\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.484983 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-config-data\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.485091 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-combined-ca-bundle\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.487999 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-credential-keys\") pod \"keystone-bootstrap-ldjsj\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:15 crc kubenswrapper[4937]: I0225 16:12:15.574270 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:18 crc kubenswrapper[4937]: I0225 16:12:18.556915 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-6xfpd" podUID="b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Feb 25 16:12:18 crc kubenswrapper[4937]: I0225 16:12:18.557089 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:12:19 crc kubenswrapper[4937]: I0225 16:12:19.242001 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.137:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.591977 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.605812 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.607105 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e19ac505-41d9-4d1d-b75a-0c88e26960c8","Type":"ContainerDied","Data":"4b993c8a26f5fc338f51f2888f7966159661bbf45b30ab38232fb63135a2c92d"} Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.607159 4937 scope.go:117] "RemoveContainer" containerID="a6f706d7310d97b2900b971c494236f99adf6904c65548a724f429abe7365374" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.615230 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-6xfpd" event={"ID":"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7","Type":"ContainerDied","Data":"5fe907070a15ab85aceaea53620ddb63128a2a1a7b6b8625ef7e4170f81f61ba"} Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.615364 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-6xfpd" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.724055 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-web-config\") pod \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.724110 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-config\") pod \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.724195 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-thanos-prometheus-http-client-file\") pod \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.724238 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-ovsdbserver-nb\") pod \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.724713 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e19ac505-41d9-4d1d-b75a-0c88e26960c8-config-out\") pod \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.724791 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmxns\" (UniqueName: \"kubernetes.io/projected/e19ac505-41d9-4d1d-b75a-0c88e26960c8-kube-api-access-zmxns\") pod \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.724848 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-2\") pod \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.724990 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-config\") pod \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.725063 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-1\") pod \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.725095 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e19ac505-41d9-4d1d-b75a-0c88e26960c8-tls-assets\") pod \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.725194 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrncf\" (UniqueName: \"kubernetes.io/projected/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-kube-api-access-vrncf\") pod \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.725252 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-dns-svc\") pod \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.725331 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-0\") pod \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.725407 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-ovsdbserver-sb\") pod \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\" (UID: \"b8f0c1cf-9d3a-4974-9959-2a6327b9dac7\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.725608 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\" (UID: \"e19ac505-41d9-4d1d-b75a-0c88e26960c8\") " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.725681 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "e19ac505-41d9-4d1d-b75a-0c88e26960c8" (UID: 
"e19ac505-41d9-4d1d-b75a-0c88e26960c8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.726224 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "e19ac505-41d9-4d1d-b75a-0c88e26960c8" (UID: "e19ac505-41d9-4d1d-b75a-0c88e26960c8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.726382 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "e19ac505-41d9-4d1d-b75a-0c88e26960c8" (UID: "e19ac505-41d9-4d1d-b75a-0c88e26960c8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.726651 4937 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.726758 4937 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.726846 4937 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e19ac505-41d9-4d1d-b75a-0c88e26960c8-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.729247 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e19ac505-41d9-4d1d-b75a-0c88e26960c8" (UID: "e19ac505-41d9-4d1d-b75a-0c88e26960c8"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.730635 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e19ac505-41d9-4d1d-b75a-0c88e26960c8-config-out" (OuterVolumeSpecName: "config-out") pod "e19ac505-41d9-4d1d-b75a-0c88e26960c8" (UID: "e19ac505-41d9-4d1d-b75a-0c88e26960c8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.730647 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-config" (OuterVolumeSpecName: "config") pod "e19ac505-41d9-4d1d-b75a-0c88e26960c8" (UID: "e19ac505-41d9-4d1d-b75a-0c88e26960c8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.730982 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19ac505-41d9-4d1d-b75a-0c88e26960c8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e19ac505-41d9-4d1d-b75a-0c88e26960c8" (UID: "e19ac505-41d9-4d1d-b75a-0c88e26960c8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.733749 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19ac505-41d9-4d1d-b75a-0c88e26960c8-kube-api-access-zmxns" (OuterVolumeSpecName: "kube-api-access-zmxns") pod "e19ac505-41d9-4d1d-b75a-0c88e26960c8" (UID: "e19ac505-41d9-4d1d-b75a-0c88e26960c8"). InnerVolumeSpecName "kube-api-access-zmxns". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.733796 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-kube-api-access-vrncf" (OuterVolumeSpecName: "kube-api-access-vrncf") pod "b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" (UID: "b8f0c1cf-9d3a-4974-9959-2a6327b9dac7"). InnerVolumeSpecName "kube-api-access-vrncf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.750160 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "e19ac505-41d9-4d1d-b75a-0c88e26960c8" (UID: "e19ac505-41d9-4d1d-b75a-0c88e26960c8"). InnerVolumeSpecName "pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.772767 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" (UID: "b8f0c1cf-9d3a-4974-9959-2a6327b9dac7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.777797 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-config" (OuterVolumeSpecName: "config") pod "b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" (UID: "b8f0c1cf-9d3a-4974-9959-2a6327b9dac7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.784231 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" (UID: "b8f0c1cf-9d3a-4974-9959-2a6327b9dac7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.785943 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-web-config" (OuterVolumeSpecName: "web-config") pod "e19ac505-41d9-4d1d-b75a-0c88e26960c8" (UID: "e19ac505-41d9-4d1d-b75a-0c88e26960c8"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.791909 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" (UID: "b8f0c1cf-9d3a-4974-9959-2a6327b9dac7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.829285 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.829370 4937 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") on node \"crc\" " Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.829414 4937 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-web-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.829429 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.829442 4937 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e19ac505-41d9-4d1d-b75a-0c88e26960c8-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.829455 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.829468 4937 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e19ac505-41d9-4d1d-b75a-0c88e26960c8-config-out\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.829513 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmxns\" (UniqueName: \"kubernetes.io/projected/e19ac505-41d9-4d1d-b75a-0c88e26960c8-kube-api-access-zmxns\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.829528 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.829539 4937 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e19ac505-41d9-4d1d-b75a-0c88e26960c8-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.829549 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrncf\" (UniqueName: \"kubernetes.io/projected/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-kube-api-access-vrncf\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 
16:12:22.829560 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.873319 4937 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.873611 4937 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4") on node "crc" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.931092 4937 reconciler_common.go:293] "Volume detached for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.956604 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-6xfpd"] Feb 25 16:12:22 crc kubenswrapper[4937]: I0225 16:12:22.964682 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-6xfpd"] Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.388724 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" path="/var/lib/kubelet/pods/b8f0c1cf-9d3a-4974-9959-2a6327b9dac7/volumes" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.561910 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-6xfpd" podUID="b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.633073 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.675237 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.683859 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.701798 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:12:23 crc kubenswrapper[4937]: E0225 16:12:23.702170 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="config-reloader" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.702185 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="config-reloader" Feb 25 16:12:23 crc kubenswrapper[4937]: E0225 16:12:23.702199 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" containerName="dnsmasq-dns" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.702205 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" containerName="dnsmasq-dns" Feb 25 16:12:23 crc kubenswrapper[4937]: E0225 16:12:23.702213 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="thanos-sidecar" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.702219 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="thanos-sidecar" Feb 25 16:12:23 crc kubenswrapper[4937]: E0225 16:12:23.702227 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="init-config-reloader" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.702233 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="init-config-reloader" Feb 25 16:12:23 crc kubenswrapper[4937]: E0225 16:12:23.702246 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="prometheus" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.702251 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="prometheus" Feb 25 16:12:23 crc kubenswrapper[4937]: E0225 16:12:23.702271 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" containerName="init" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.702276 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" containerName="init" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.702433 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f0c1cf-9d3a-4974-9959-2a6327b9dac7" containerName="dnsmasq-dns" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.702448 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="prometheus" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.702465 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="config-reloader" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 
16:12:23.702478 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="thanos-sidecar" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.704452 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.707881 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.708033 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.708216 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-8sz4b" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.708233 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.708415 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.708685 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.708813 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.708723 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.715340 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.728872 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.853921 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.853981 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.854003 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 
16:12:23.854187 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.854240 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.854269 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.854310 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.854401 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.854556 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.854584 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.854680 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.854737 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.854999 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhcb4\" (UniqueName: \"kubernetes.io/projected/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-kube-api-access-jhcb4\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.957196 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhcb4\" (UniqueName: \"kubernetes.io/projected/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-kube-api-access-jhcb4\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.957289 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.957338 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.957364 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.957416 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.957439 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.957469 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.957518 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.957542 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.957592 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.957613 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.957654 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.957691 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.958616 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.960817 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " 
pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.965376 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.972201 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.972327 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.973614 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.975129 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.975198 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.979115 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.979698 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.992605 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 16:12:23 crc kubenswrapper[4937]: I0225 16:12:23.992647 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bcbb07c66c16a8bfff5bebc3b08f557f1544f8693883973b0b05074d97af7e5f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:24 crc kubenswrapper[4937]: I0225 16:12:24.005175 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:24 crc kubenswrapper[4937]: I0225 16:12:24.010303 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhcb4\" (UniqueName: \"kubernetes.io/projected/1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a-kube-api-access-jhcb4\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:24 crc kubenswrapper[4937]: I0225 16:12:24.064971 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f9d7478-2ac8-4cd3-8038-3b573fa296a4\") pod \"prometheus-metric-storage-0\" (UID: \"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a\") " pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:24 crc kubenswrapper[4937]: I0225 16:12:24.241436 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.137:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 16:12:24 crc kubenswrapper[4937]: I0225 16:12:24.355913 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:25 crc kubenswrapper[4937]: I0225 16:12:25.377601 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e19ac505-41d9-4d1d-b75a-0c88e26960c8" path="/var/lib/kubelet/pods/e19ac505-41d9-4d1d-b75a-0c88e26960c8/volumes" Feb 25 16:12:25 crc kubenswrapper[4937]: E0225 16:12:25.410017 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 25 16:12:25 crc kubenswrapper[4937]: E0225 16:12:25.410249 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z5cb6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-7w4sz_openstack(38a537ec-7743-44bd-b428-fa52adf39305): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:12:25 crc kubenswrapper[4937]: E0225 16:12:25.411362 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-7w4sz" podUID="38a537ec-7743-44bd-b428-fa52adf39305" Feb 25 16:12:25 crc kubenswrapper[4937]: E0225 16:12:25.658076 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-7w4sz" podUID="38a537ec-7743-44bd-b428-fa52adf39305" Feb 25 16:12:25 crc kubenswrapper[4937]: E0225 16:12:25.838918 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 25 16:12:25 crc kubenswrapper[4937]: E0225 16:12:25.839134 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55bh87hcchffh665h7ch76h658h696h5dhbchdfhc8hc4hdch5d6h656h56bh99h556h566h5cbhdh66chfh9ch65fhfbh5d9h5bhf8hf5q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hgg8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(8ef5e4a5-46f1-4f72-ab91-699865d33243): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:12:25 crc kubenswrapper[4937]: I0225 16:12:25.841693 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 16:12:27 crc kubenswrapper[4937]: E0225 16:12:27.706688 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 25 16:12:27 crc kubenswrapper[4937]: E0225 16:12:27.707245 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zp2c7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6rpk2_openstack(006fb5e7-a244-4758-8065-3615f5a2b9b7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:12:27 crc kubenswrapper[4937]: E0225 16:12:27.708462 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6rpk2" podUID="006fb5e7-a244-4758-8065-3615f5a2b9b7" Feb 25 16:12:28 crc kubenswrapper[4937]: E0225 16:12:28.682910 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-6rpk2" podUID="006fb5e7-a244-4758-8065-3615f5a2b9b7" Feb 25 16:12:29 crc kubenswrapper[4937]: E0225 16:12:29.918043 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-swift-object:current-podified" Feb 25 16:12:29 crc kubenswrapper[4937]: E0225 16:12:29.918602 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:object-server,Image:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,Command:[/usr/bin/swift-object-server /etc/swift/object-server.conf.d -v],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:object,HostPort:0,ContainerPort:6200,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55chf6h56fhc6h59fhfbh59fh544h5b7h6bh65dh58dh5d7h8ch67chbfh8bh657hcdh89h58bhbch58fhf9h67dh68fh54ch59fh79hc6hc6h5cdq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:swift,ReadOnly:false,MountPath:/srv/node/pv,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-swift,ReadOnly:false,MountPath:/etc/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cache,ReadOnly:false,MountPath:/var/cache/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lock,ReadOnly:false,MountPath:/var/lock,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-76wnb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42445,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-storage-0_openstack(48d22af0-5579-46fb-889d-fd34e46d26e9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:12:30 crc kubenswrapper[4937]: I0225 16:12:30.452721 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533932-w2rgh"] Feb 25 16:12:30 crc kubenswrapper[4937]: I0225 16:12:30.557147 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:12:30 crc kubenswrapper[4937]: I0225 16:12:30.696831 4937 generic.go:334] "Generic (PLEG): container finished" podID="90764917-3dc9-4778-b224-67cb4ae1e49d" containerID="f828440b21ea18ae8263a8d3f11bc51cf5f99ec4284f45e31327cfe60dba52ee" exitCode=0 Feb 25 16:12:30 crc kubenswrapper[4937]: I0225 16:12:30.696871 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4vmm4" event={"ID":"90764917-3dc9-4778-b224-67cb4ae1e49d","Type":"ContainerDied","Data":"f828440b21ea18ae8263a8d3f11bc51cf5f99ec4284f45e31327cfe60dba52ee"} Feb 25 16:12:32 crc kubenswrapper[4937]: I0225 16:12:32.844975 4937 scope.go:117] "RemoveContainer" containerID="ce7810c7559494ae2ce2d68e150d814e5181253265d082b3bf485bde3ffceb08" Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.398390 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-754k4"] Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.402327 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:12:33 crc kubenswrapper[4937]: W0225 16:12:33.544921 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82cf475b_cc29_4d90_a1c4_73e0170f0f48.slice/crio-1ed260763928b99b6d2133ce3f4b71d78aa05aa1ba38e1c597ef7b2c8493166e WatchSource:0}: Error finding container 1ed260763928b99b6d2133ce3f4b71d78aa05aa1ba38e1c597ef7b2c8493166e: Status 404 returned error can't find the container with id 1ed260763928b99b6d2133ce3f4b71d78aa05aa1ba38e1c597ef7b2c8493166e Feb 25 16:12:33 crc kubenswrapper[4937]: W0225 16:12:33.547664 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfac18368_fe1e_4431_bf59_1c1e613bc0d6.slice/crio-ea8f093f290906ca0d673b66786333b2793cfe8a4754f9c0cd6c31226fc0e749 WatchSource:0}: Error finding container ea8f093f290906ca0d673b66786333b2793cfe8a4754f9c0cd6c31226fc0e749: Status 404 returned error can't find the container with id ea8f093f290906ca0d673b66786333b2793cfe8a4754f9c0cd6c31226fc0e749 Feb 25 16:12:33 crc kubenswrapper[4937]: E0225 16:12:33.578792 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 25 16:12:33 crc kubenswrapper[4937]: E0225 16:12:33.578870 4937 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 25 16:12:33 crc kubenswrapper[4937]: E0225 16:12:33.579110 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn6jr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-b4zhr_openstack(44849697-9b41-4439-b8c7-f497036543aa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:12:33 crc kubenswrapper[4937]: E0225 16:12:33.581561 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-b4zhr" podUID="44849697-9b41-4439-b8c7-f497036543aa" Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.653641 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4vmm4" Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.725637 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-754k4" event={"ID":"82cf475b-cc29-4d90-a1c4-73e0170f0f48","Type":"ContainerStarted","Data":"1ed260763928b99b6d2133ce3f4b71d78aa05aa1ba38e1c597ef7b2c8493166e"} Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.727404 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fac18368-fe1e-4431-bf59-1c1e613bc0d6","Type":"ContainerStarted","Data":"ea8f093f290906ca0d673b66786333b2793cfe8a4754f9c0cd6c31226fc0e749"} Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.729322 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4vmm4" event={"ID":"90764917-3dc9-4778-b224-67cb4ae1e49d","Type":"ContainerDied","Data":"9f5c3312593d86df03e5320ca5220e3801b7aeb0e031908f1259136690d83463"} Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.729364 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f5c3312593d86df03e5320ca5220e3801b7aeb0e031908f1259136690d83463" Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.729434 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4vmm4" Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.734452 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533932-w2rgh" event={"ID":"fa840e84-cc33-4fed-9f62-50fc286001be","Type":"ContainerStarted","Data":"3efe1b4622fd755b3226fe123c971036c3b3ee40c7217fe35149e316c55946c7"} Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.736288 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c021e5b5-9038-4a91-8785-7461a1d3c981","Type":"ContainerStarted","Data":"7f79aa15a8ee531ad3e279dbe486515d32169aa55bedb346ce0268c65dea4a2f"} Feb 25 16:12:33 crc kubenswrapper[4937]: E0225 16:12:33.738333 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-b4zhr" podUID="44849697-9b41-4439-b8c7-f497036543aa" Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.782748 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t87p\" (UniqueName: \"kubernetes.io/projected/90764917-3dc9-4778-b224-67cb4ae1e49d-kube-api-access-4t87p\") pod \"90764917-3dc9-4778-b224-67cb4ae1e49d\" (UID: \"90764917-3dc9-4778-b224-67cb4ae1e49d\") " Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.782919 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90764917-3dc9-4778-b224-67cb4ae1e49d-combined-ca-bundle\") pod \"90764917-3dc9-4778-b224-67cb4ae1e49d\" (UID: \"90764917-3dc9-4778-b224-67cb4ae1e49d\") " Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.783050 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90764917-3dc9-4778-b224-67cb4ae1e49d-config\") pod \"90764917-3dc9-4778-b224-67cb4ae1e49d\" (UID: \"90764917-3dc9-4778-b224-67cb4ae1e49d\") " Feb 25 16:12:33 crc kubenswrapper[4937]: 
I0225 16:12:33.794825 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90764917-3dc9-4778-b224-67cb4ae1e49d-kube-api-access-4t87p" (OuterVolumeSpecName: "kube-api-access-4t87p") pod "90764917-3dc9-4778-b224-67cb4ae1e49d" (UID: "90764917-3dc9-4778-b224-67cb4ae1e49d"). InnerVolumeSpecName "kube-api-access-4t87p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.819177 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90764917-3dc9-4778-b224-67cb4ae1e49d-config" (OuterVolumeSpecName: "config") pod "90764917-3dc9-4778-b224-67cb4ae1e49d" (UID: "90764917-3dc9-4778-b224-67cb4ae1e49d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.821627 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90764917-3dc9-4778-b224-67cb4ae1e49d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90764917-3dc9-4778-b224-67cb4ae1e49d" (UID: "90764917-3dc9-4778-b224-67cb4ae1e49d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.885638 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/90764917-3dc9-4778-b224-67cb4ae1e49d-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.885687 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t87p\" (UniqueName: \"kubernetes.io/projected/90764917-3dc9-4778-b224-67cb4ae1e49d-kube-api-access-4t87p\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.885706 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90764917-3dc9-4778-b224-67cb4ae1e49d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:33 crc kubenswrapper[4937]: I0225 16:12:33.961940 4937 scope.go:117] "RemoveContainer" containerID="fc97b8e06f2912e439a8df9a3ca91b06cb74e379b0a69a88fd74be43db38ea21" Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.010766 4937 scope.go:117] "RemoveContainer" containerID="0f02def2b2987794ca53f24f3e1c837b05c0fb749e7d29a9c4bbf2f2caeb3500" Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.084924 4937 scope.go:117] "RemoveContainer" containerID="dca5c99c0ff4635c5eebdf665c288bff87fa4b0ee4dbd978ff021493176fd38a" Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.146193 4937 scope.go:117] "RemoveContainer" containerID="a240d8936cb0e8c601adb670e19ca512ef04be6aefd27f3fc71d9c22b2aa83ea" Feb 25 16:12:34 crc kubenswrapper[4937]: E0225 16:12:34.300266 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"object-server\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"object-replicator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-auditor\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-updater\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"rsync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"swift-recon-cron\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\"]" pod="openstack/swift-storage-0" podUID="48d22af0-5579-46fb-889d-fd34e46d26e9" Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.343156 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ldjsj"] Feb 25 16:12:34 crc kubenswrapper[4937]: W0225 16:12:34.403267 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69b3388f_2762_4ebe_a014_e1740aee3b66.slice/crio-46c0035f84f787435df418aab1c883e6c45db8e7804c218a77a868d711e7e7c6 WatchSource:0}: Error finding container 46c0035f84f787435df418aab1c883e6c45db8e7804c218a77a868d711e7e7c6: Status 404 returned error can't find the container with id 46c0035f84f787435df418aab1c883e6c45db8e7804c218a77a868d711e7e7c6 Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.538365 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.763654 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ldjsj" event={"ID":"69b3388f-2762-4ebe-a014-e1740aee3b66","Type":"ContainerStarted","Data":"24d96139ad175c0781598244dc1d9972d8a382730deb3a8b23a5a63248c6f156"} Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.764024 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ldjsj" event={"ID":"69b3388f-2762-4ebe-a014-e1740aee3b66","Type":"ContainerStarted","Data":"46c0035f84f787435df418aab1c883e6c45db8e7804c218a77a868d711e7e7c6"} Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.771608 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8tmw5" event={"ID":"8f988c32-d57e-4e63-add5-1e86a8818641","Type":"ContainerStarted","Data":"39c1d35910d9965962a2c3133843aac7413ceb0c3af56828f4d782dae8ca4270"} Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.779692 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ef5e4a5-46f1-4f72-ab91-699865d33243","Type":"ContainerStarted","Data":"975bf9c7209b653fdf9d6a2d25684b654b89782f6a2efef799f7e3f9f4ae31b7"} Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.791667 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c021e5b5-9038-4a91-8785-7461a1d3c981","Type":"ContainerStarted","Data":"81f89b2c2a68d55c05964191735c5787af8dbacc9320d0fab4fbce2e13c1458b"} Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.793793 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ldjsj" podStartSLOduration=19.793773507 podStartE2EDuration="19.793773507s" podCreationTimestamp="2026-02-25 16:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:12:34.78870855 +0000 UTC m=+1605.802100440" watchObservedRunningTime="2026-02-25 16:12:34.793773507 +0000 UTC m=+1605.807165397" Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 
16:12:34.798875 4937 generic.go:334] "Generic (PLEG): container finished" podID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerID="5f4ed79ccc2b234a018f87b53b79e7cc63f7de11f0f0c3c27230bd9a989aae2c" exitCode=0 Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.798956 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-754k4" event={"ID":"82cf475b-cc29-4d90-a1c4-73e0170f0f48","Type":"ContainerDied","Data":"5f4ed79ccc2b234a018f87b53b79e7cc63f7de11f0f0c3c27230bd9a989aae2c"} Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.810461 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a","Type":"ContainerStarted","Data":"6347f5f20a2e393313cb6cb6f05da3a60eaa0277ea6bfccdfc287b17ce55dfd1"} Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.839270 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"9942655be559504f8831ef7738c6c21d0534706280e7ba82a1e24f09c2b0b6e5"} Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.857544 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fb745b69-cv6h6"] Feb 25 16:12:34 crc kubenswrapper[4937]: E0225 16:12:34.858066 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90764917-3dc9-4778-b224-67cb4ae1e49d" containerName="neutron-db-sync" Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.858082 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="90764917-3dc9-4778-b224-67cb4ae1e49d" containerName="neutron-db-sync" Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.858304 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="90764917-3dc9-4778-b224-67cb4ae1e49d" containerName="neutron-db-sync" Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.859691 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.861625 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8tmw5" podStartSLOduration=14.787322493 podStartE2EDuration="48.861614066s" podCreationTimestamp="2026-02-25 16:11:46 +0000 UTC" firstStartedPulling="2026-02-25 16:11:48.411392759 +0000 UTC m=+1559.424784649" lastFinishedPulling="2026-02-25 16:12:22.485684332 +0000 UTC m=+1593.499076222" observedRunningTime="2026-02-25 16:12:34.8410349 +0000 UTC m=+1605.854426790" watchObservedRunningTime="2026-02-25 16:12:34.861614066 +0000 UTC m=+1605.875005956" Feb 25 16:12:34 crc kubenswrapper[4937]: E0225 16:12:34.920788 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"object-server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-replicator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-auditor\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-updater\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"rsync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"swift-recon-cron\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\"]" pod="openstack/swift-storage-0" podUID="48d22af0-5579-46fb-889d-fd34e46d26e9" Feb 25 16:12:34 crc kubenswrapper[4937]: I0225 16:12:34.923119 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-cv6h6"] Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.016475 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9vsx\" (UniqueName: \"kubernetes.io/projected/de392ea2-2165-462b-83dc-d21599f4888b-kube-api-access-n9vsx\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.016548 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.016616 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-dns-svc\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.016720 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.016763 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-config\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.121734 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-config\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.121779 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9vsx\" (UniqueName: \"kubernetes.io/projected/de392ea2-2165-462b-83dc-d21599f4888b-kube-api-access-n9vsx\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.121806 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.121871 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-dns-svc\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.121946 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.123419 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-config\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.123468 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.123985 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-dns-svc\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: 
\"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.131332 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f86fff94d-29bhj"] Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.134745 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.134850 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.140064 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.140388 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.140635 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.148393 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hb9w4" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.167726 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f86fff94d-29bhj"] Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.189456 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9vsx\" (UniqueName: \"kubernetes.io/projected/de392ea2-2165-462b-83dc-d21599f4888b-kube-api-access-n9vsx\") pod \"dnsmasq-dns-fb745b69-cv6h6\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.241024 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-config\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.241108 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-httpd-config\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.241175 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-combined-ca-bundle\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.241199 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-ovndb-tls-certs\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " 
pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.241226 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9xvs\" (UniqueName: \"kubernetes.io/projected/00113672-6314-4271-b571-682e18b9a920-kube-api-access-j9xvs\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.347125 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-combined-ca-bundle\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.347810 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-ovndb-tls-certs\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.347844 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9xvs\" (UniqueName: \"kubernetes.io/projected/00113672-6314-4271-b571-682e18b9a920-kube-api-access-j9xvs\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.348003 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-config\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.348077 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-httpd-config\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.353570 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-httpd-config\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.357031 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.357427 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-config\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.357974 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-combined-ca-bundle\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.366015 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-ovndb-tls-certs\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.382622 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9xvs\" (UniqueName: \"kubernetes.io/projected/00113672-6314-4271-b571-682e18b9a920-kube-api-access-j9xvs\") pod \"neutron-f86fff94d-29bhj\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.666991 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:35 crc kubenswrapper[4937]: I0225 16:12:35.906902 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fac18368-fe1e-4431-bf59-1c1e613bc0d6","Type":"ContainerStarted","Data":"62af60d64b8da5ff29d5b9384e40735a85c436fea31e028ac31eff1f534c467f"} Feb 25 16:12:35 crc kubenswrapper[4937]: E0225 16:12:35.969688 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"object-server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-replicator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-auditor\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-updater\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"rsync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"swift-recon-cron\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\"]" pod="openstack/swift-storage-0" podUID="48d22af0-5579-46fb-889d-fd34e46d26e9" Feb 25 16:12:36 crc kubenswrapper[4937]: I0225 16:12:36.054256 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-cv6h6"] Feb 25 16:12:36 crc 
kubenswrapper[4937]: I0225 16:12:36.552371 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f86fff94d-29bhj"] Feb 25 16:12:36 crc kubenswrapper[4937]: W0225 16:12:36.581024 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00113672_6314_4271_b571_682e18b9a920.slice/crio-036dc01cd54d6e8d73a02603e36cbf1736890b63d5b1b877422881489a96aacb WatchSource:0}: Error finding container 036dc01cd54d6e8d73a02603e36cbf1736890b63d5b1b877422881489a96aacb: Status 404 returned error can't find the container with id 036dc01cd54d6e8d73a02603e36cbf1736890b63d5b1b877422881489a96aacb Feb 25 16:12:36 crc kubenswrapper[4937]: I0225 16:12:36.925990 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f86fff94d-29bhj" event={"ID":"00113672-6314-4271-b571-682e18b9a920","Type":"ContainerStarted","Data":"036dc01cd54d6e8d73a02603e36cbf1736890b63d5b1b877422881489a96aacb"} Feb 25 16:12:36 crc kubenswrapper[4937]: I0225 16:12:36.929448 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c021e5b5-9038-4a91-8785-7461a1d3c981","Type":"ContainerStarted","Data":"6e1220f05b516229b945ce6ad87ff9617b4e5a496cf0806562e0602335e6f092"} Feb 25 16:12:36 crc kubenswrapper[4937]: I0225 16:12:36.941897 4937 generic.go:334] "Generic (PLEG): container finished" podID="de392ea2-2165-462b-83dc-d21599f4888b" containerID="3e1c5775ace7d313d3863739ee7d2d84dd18e635104cd379c0af96fca2dab63e" exitCode=0 Feb 25 16:12:36 crc kubenswrapper[4937]: I0225 16:12:36.941937 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-cv6h6" event={"ID":"de392ea2-2165-462b-83dc-d21599f4888b","Type":"ContainerDied","Data":"3e1c5775ace7d313d3863739ee7d2d84dd18e635104cd379c0af96fca2dab63e"} Feb 25 16:12:36 crc kubenswrapper[4937]: I0225 16:12:36.941960 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-cv6h6" event={"ID":"de392ea2-2165-462b-83dc-d21599f4888b","Type":"ContainerStarted","Data":"be945541a655da1c7a447ccd266c2e94a192275ad7aa6b57640fb9101787e3f6"} Feb 25 16:12:36 crc kubenswrapper[4937]: I0225 16:12:36.960435 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=38.960408816 podStartE2EDuration="38.960408816s" podCreationTimestamp="2026-02-25 16:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:12:36.951909923 +0000 UTC m=+1607.965301813" watchObservedRunningTime="2026-02-25 16:12:36.960408816 +0000 UTC m=+1607.973800706" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.764254 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-589f6455c9-dwk7g"] Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.766760 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.771362 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.771769 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.791159 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-589f6455c9-dwk7g"] Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.844352 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxwhn\" (UniqueName: \"kubernetes.io/projected/2a9fc39a-301c-48eb-8ae0-238271352711-kube-api-access-gxwhn\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.844705 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-combined-ca-bundle\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.844771 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-ovndb-tls-certs\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.844808 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-config\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.844854 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-public-tls-certs\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.844945 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-internal-tls-certs\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.844982 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-httpd-config\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.969624 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"fac18368-fe1e-4431-bf59-1c1e613bc0d6","Type":"ContainerStarted","Data":"8ba797b8b788cb29e8b568c54553617699937dbc48e9377eb47b38e1dacbdf57"} Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.973572 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxwhn\" (UniqueName: \"kubernetes.io/projected/2a9fc39a-301c-48eb-8ae0-238271352711-kube-api-access-gxwhn\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.973643 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-combined-ca-bundle\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.973676 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-ovndb-tls-certs\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.973697 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-config\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.973720 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-public-tls-certs\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.973767 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-internal-tls-certs\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.973785 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-httpd-config\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.985684 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-config\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.989955 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-combined-ca-bundle\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 
25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.990735 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-ovndb-tls-certs\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.992310 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-public-tls-certs\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.994241 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxwhn\" (UniqueName: \"kubernetes.io/projected/2a9fc39a-301c-48eb-8ae0-238271352711-kube-api-access-gxwhn\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:37 crc kubenswrapper[4937]: I0225 16:12:37.997779 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-httpd-config\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.006226 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-internal-tls-certs\") pod \"neutron-589f6455c9-dwk7g\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.006478 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a","Type":"ContainerStarted","Data":"23cecae34658c8d20224bc287a30c3fe49fe078c805e9bce06b7cb26ec60abb9"} Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.032733 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=40.032714455 podStartE2EDuration="40.032714455s" podCreationTimestamp="2026-02-25 16:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:12:38.00701894 +0000 UTC m=+1609.020410840" watchObservedRunningTime="2026-02-25 16:12:38.032714455 +0000 UTC m=+1609.046106345" Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.039393 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533932-w2rgh" event={"ID":"fa840e84-cc33-4fed-9f62-50fc286001be","Type":"ContainerStarted","Data":"76045b66ae396f1673da50d35a9cdea3cc02e17bb8a1daaf6a0d60d35c0e65e4"} Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.092154 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-754k4" event={"ID":"82cf475b-cc29-4d90-a1c4-73e0170f0f48","Type":"ContainerStarted","Data":"fd138b73791dbd412c40686640180fa4e8e5a765bff1a7628fbfdfa25ff4a202"} Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.097267 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29533932-w2rgh" podStartSLOduration=35.872722062 podStartE2EDuration="38.097246032s" podCreationTimestamp="2026-02-25 16:12:00 +0000 UTC" firstStartedPulling="2026-02-25 16:12:33.53246062 +0000 UTC m=+1604.545852530" lastFinishedPulling="2026-02-25 16:12:35.75698461 +0000 UTC m=+1606.770376500" observedRunningTime="2026-02-25 16:12:38.092406151 +0000 UTC m=+1609.105798041" watchObservedRunningTime="2026-02-25 16:12:38.097246032 +0000 UTC m=+1609.110637922" Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.113681 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-cv6h6" event={"ID":"de392ea2-2165-462b-83dc-d21599f4888b","Type":"ContainerStarted","Data":"81db60abb1ddaad73fa876838afd22b15acb3e798953036b822338fc96b2d90e"} Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.114606 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.115444 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.131145 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f86fff94d-29bhj" event={"ID":"00113672-6314-4271-b571-682e18b9a920","Type":"ContainerStarted","Data":"9588a5557ba29747508d1d9d1cd9fa4955e2495838a9f652499423a47557491d"} Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.131188 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f86fff94d-29bhj" event={"ID":"00113672-6314-4271-b571-682e18b9a920","Type":"ContainerStarted","Data":"b8afa98b9fa3cd878e6f97dd5d597e90939e2b1f892fd059bb1d43f89758f3f6"} Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.131206 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.160985 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f86fff94d-29bhj" podStartSLOduration=3.160962819 podStartE2EDuration="3.160962819s" podCreationTimestamp="2026-02-25 16:12:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:12:38.156952869 +0000 UTC m=+1609.170344759" watchObservedRunningTime="2026-02-25 16:12:38.160962819 +0000 UTC m=+1609.174354709" Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.201791 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fb745b69-cv6h6" podStartSLOduration=4.201765952 podStartE2EDuration="4.201765952s" podCreationTimestamp="2026-02-25 16:12:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:12:38.200027098 +0000 UTC m=+1609.213418988" watchObservedRunningTime="2026-02-25 16:12:38.201765952 +0000 UTC m=+1609.215157842" Feb 25 16:12:38 crc kubenswrapper[4937]: I0225 16:12:38.869211 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-589f6455c9-dwk7g"] Feb 25 16:12:39 crc kubenswrapper[4937]: I0225 16:12:39.113039 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 25 16:12:39 crc kubenswrapper[4937]: I0225 16:12:39.113505 4937 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 25 16:12:39 crc kubenswrapper[4937]: I0225 16:12:39.152810 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589f6455c9-dwk7g" event={"ID":"2a9fc39a-301c-48eb-8ae0-238271352711","Type":"ContainerStarted","Data":"b649795795191f62ac09faac8c80c69067a83631ed639e3ea2f9db1a79474377"} Feb 25 16:12:39 crc kubenswrapper[4937]: I0225 16:12:39.155704 4937 generic.go:334] "Generic (PLEG): container finished" podID="fa840e84-cc33-4fed-9f62-50fc286001be" containerID="76045b66ae396f1673da50d35a9cdea3cc02e17bb8a1daaf6a0d60d35c0e65e4" exitCode=0 Feb 25 16:12:39 crc kubenswrapper[4937]: I0225 16:12:39.156150 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533932-w2rgh" event={"ID":"fa840e84-cc33-4fed-9f62-50fc286001be","Type":"ContainerDied","Data":"76045b66ae396f1673da50d35a9cdea3cc02e17bb8a1daaf6a0d60d35c0e65e4"} Feb 25 16:12:39 crc kubenswrapper[4937]: I0225 16:12:39.156173 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 25 16:12:39 crc kubenswrapper[4937]: I0225 16:12:39.157688 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 25 16:12:39 crc kubenswrapper[4937]: I0225 16:12:39.162259 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 25 16:12:39 crc kubenswrapper[4937]: I0225 16:12:39.206509 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 25 16:12:39 crc kubenswrapper[4937]: I0225 16:12:39.206918 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 25 16:12:39 crc kubenswrapper[4937]: I0225 16:12:39.229236 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 25 16:12:40 crc kubenswrapper[4937]: I0225 16:12:40.168367 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589f6455c9-dwk7g" event={"ID":"2a9fc39a-301c-48eb-8ae0-238271352711","Type":"ContainerStarted","Data":"76089335ac4fa0fe8873cf19d70b49ebd7b8a0769aafc4dc46234b25f3fdc49e"} Feb 25 16:12:40 crc kubenswrapper[4937]: I0225 16:12:40.171214 4937 generic.go:334] "Generic (PLEG): container finished" podID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerID="fd138b73791dbd412c40686640180fa4e8e5a765bff1a7628fbfdfa25ff4a202" exitCode=0 Feb 25 16:12:40 crc kubenswrapper[4937]: I0225 16:12:40.171335 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-754k4" event={"ID":"82cf475b-cc29-4d90-a1c4-73e0170f0f48","Type":"ContainerDied","Data":"fd138b73791dbd412c40686640180fa4e8e5a765bff1a7628fbfdfa25ff4a202"} Feb 25 16:12:40 crc kubenswrapper[4937]: I0225 16:12:40.172901 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 25 16:12:40 crc kubenswrapper[4937]: I0225 16:12:40.172938 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 25 16:12:40 crc kubenswrapper[4937]: I0225 16:12:40.172973 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 25 16:12:40 crc kubenswrapper[4937]: I0225 16:12:40.172985 
4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 25 16:12:41 crc kubenswrapper[4937]: I0225 16:12:41.497036 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:12:41 crc kubenswrapper[4937]: I0225 16:12:41.497566 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:12:43 crc kubenswrapper[4937]: I0225 16:12:43.934073 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533932-w2rgh" Feb 25 16:12:44 crc kubenswrapper[4937]: I0225 16:12:44.084992 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxc5s\" (UniqueName: \"kubernetes.io/projected/fa840e84-cc33-4fed-9f62-50fc286001be-kube-api-access-pxc5s\") pod \"fa840e84-cc33-4fed-9f62-50fc286001be\" (UID: \"fa840e84-cc33-4fed-9f62-50fc286001be\") " Feb 25 16:12:44 crc kubenswrapper[4937]: I0225 16:12:44.092677 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa840e84-cc33-4fed-9f62-50fc286001be-kube-api-access-pxc5s" (OuterVolumeSpecName: "kube-api-access-pxc5s") pod "fa840e84-cc33-4fed-9f62-50fc286001be" (UID: "fa840e84-cc33-4fed-9f62-50fc286001be"). InnerVolumeSpecName "kube-api-access-pxc5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:12:44 crc kubenswrapper[4937]: I0225 16:12:44.188032 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxc5s\" (UniqueName: \"kubernetes.io/projected/fa840e84-cc33-4fed-9f62-50fc286001be-kube-api-access-pxc5s\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:44 crc kubenswrapper[4937]: I0225 16:12:44.222447 4937 generic.go:334] "Generic (PLEG): container finished" podID="1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a" containerID="23cecae34658c8d20224bc287a30c3fe49fe078c805e9bce06b7cb26ec60abb9" exitCode=0 Feb 25 16:12:44 crc kubenswrapper[4937]: I0225 16:12:44.222546 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a","Type":"ContainerDied","Data":"23cecae34658c8d20224bc287a30c3fe49fe078c805e9bce06b7cb26ec60abb9"} Feb 25 16:12:44 crc kubenswrapper[4937]: I0225 16:12:44.228556 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7w4sz" event={"ID":"38a537ec-7743-44bd-b428-fa52adf39305","Type":"ContainerStarted","Data":"bac64cbf62bbf5c9cb7526af1128d38eef5d873e3a5f17b5965cad8e809efe3c"} Feb 25 16:12:44 crc kubenswrapper[4937]: I0225 16:12:44.236746 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533932-w2rgh" event={"ID":"fa840e84-cc33-4fed-9f62-50fc286001be","Type":"ContainerDied","Data":"3efe1b4622fd755b3226fe123c971036c3b3ee40c7217fe35149e316c55946c7"} Feb 25 16:12:44 crc kubenswrapper[4937]: I0225 16:12:44.236823 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3efe1b4622fd755b3226fe123c971036c3b3ee40c7217fe35149e316c55946c7" Feb 25 16:12:44 crc kubenswrapper[4937]: I0225 16:12:44.236899 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533932-w2rgh" Feb 25 16:12:44 crc kubenswrapper[4937]: I0225 16:12:44.303154 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-7w4sz" podStartSLOduration=3.165616235 podStartE2EDuration="58.303128166s" podCreationTimestamp="2026-02-25 16:11:46 +0000 UTC" firstStartedPulling="2026-02-25 16:11:48.620335256 +0000 UTC m=+1559.633727146" lastFinishedPulling="2026-02-25 16:12:43.757847187 +0000 UTC m=+1614.771239077" observedRunningTime="2026-02-25 16:12:44.282198192 +0000 UTC m=+1615.295590082" watchObservedRunningTime="2026-02-25 16:12:44.303128166 +0000 UTC m=+1615.316520056" Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.012878 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533926-mtjxm"] Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.023568 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533926-mtjxm"] Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.249083 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ef5e4a5-46f1-4f72-ab91-699865d33243","Type":"ContainerStarted","Data":"be8325fa6e888793158a1f475725134ab900615886cb7f8cc349d8d9d4c67032"} Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.251683 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589f6455c9-dwk7g" event={"ID":"2a9fc39a-301c-48eb-8ae0-238271352711","Type":"ContainerStarted","Data":"fdaa307ea492e884bba54e6a8e0d99cfc57123b3e821d9bc28312c86a1e72f1a"} Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.255952 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-754k4" event={"ID":"82cf475b-cc29-4d90-a1c4-73e0170f0f48","Type":"ContainerStarted","Data":"b1bc4e0f1cd6278c6a9db02c66208f3ed93fc844943a3b5938767508183297ad"} Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.260163 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a","Type":"ContainerStarted","Data":"2c14bb0d29bb786ff03ec021baf8a6717a67765a57e124a3e96ce41eb7b06e25"} Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.262043 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f988c32-d57e-4e63-add5-1e86a8818641" containerID="39c1d35910d9965962a2c3133843aac7413ceb0c3af56828f4d782dae8ca4270" exitCode=0 Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.262088 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8tmw5" event={"ID":"8f988c32-d57e-4e63-add5-1e86a8818641","Type":"ContainerDied","Data":"39c1d35910d9965962a2c3133843aac7413ceb0c3af56828f4d782dae8ca4270"} Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.291331 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-589f6455c9-dwk7g" podStartSLOduration=8.29131481 podStartE2EDuration="8.29131481s" podCreationTimestamp="2026-02-25 16:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:12:45.279679158 +0000 UTC m=+1616.293071048" watchObservedRunningTime="2026-02-25 16:12:45.29131481 +0000 UTC m=+1616.304706700" Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.306042 4937 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-operators-754k4" podStartSLOduration=30.614148567 podStartE2EDuration="40.306022878s" podCreationTimestamp="2026-02-25 16:12:05 +0000 UTC" firstStartedPulling="2026-02-25 16:12:34.801751367 +0000 UTC m=+1605.815143257" lastFinishedPulling="2026-02-25 16:12:44.493625678 +0000 UTC m=+1615.507017568" observedRunningTime="2026-02-25 16:12:45.300442249 +0000 UTC m=+1616.313834139" watchObservedRunningTime="2026-02-25 16:12:45.306022878 +0000 UTC m=+1616.319414768" Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.359254 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.385945 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e30f3c-7001-4619-9e1f-3ec0c825aed4" path="/var/lib/kubelet/pods/e9e30f3c-7001-4619-9e1f-3ec0c825aed4/volumes" Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.445032 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-9tdjb"] Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.445256 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" podUID="8aa18c37-6541-40d6-97b1-007a024605ec" containerName="dnsmasq-dns" containerID="cri-o://853b2165860ad0d6189ee4fb9db66b1cc68e7450c867432d6a46cf550228b6b2" gracePeriod=10 Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.971951 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:12:45 crc kubenswrapper[4937]: I0225 16:12:45.972501 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.293285 4937 generic.go:334] "Generic (PLEG): container finished" podID="8aa18c37-6541-40d6-97b1-007a024605ec" containerID="853b2165860ad0d6189ee4fb9db66b1cc68e7450c867432d6a46cf550228b6b2" exitCode=0 Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.293368 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" event={"ID":"8aa18c37-6541-40d6-97b1-007a024605ec","Type":"ContainerDied","Data":"853b2165860ad0d6189ee4fb9db66b1cc68e7450c867432d6a46cf550228b6b2"} Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.298303 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6rpk2" event={"ID":"006fb5e7-a244-4758-8065-3615f5a2b9b7","Type":"ContainerStarted","Data":"e1a6a643635b7ded6f035d06b0c8511691bcf92f830f74575dc22a413a153f75"} Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.300915 4937 generic.go:334] "Generic (PLEG): container finished" podID="69b3388f-2762-4ebe-a014-e1740aee3b66" containerID="24d96139ad175c0781598244dc1d9972d8a382730deb3a8b23a5a63248c6f156" exitCode=0 Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.301054 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ldjsj" event={"ID":"69b3388f-2762-4ebe-a014-e1740aee3b66","Type":"ContainerDied","Data":"24d96139ad175c0781598244dc1d9972d8a382730deb3a8b23a5a63248c6f156"} Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.302000 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.321113 4937 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cinder-db-sync-6rpk2" podStartSLOduration=3.986412068 podStartE2EDuration="1m0.321093885s" podCreationTimestamp="2026-02-25 16:11:46 +0000 UTC" firstStartedPulling="2026-02-25 16:11:48.125330779 +0000 UTC m=+1559.138722669" lastFinishedPulling="2026-02-25 16:12:44.460012596 +0000 UTC m=+1615.473404486" observedRunningTime="2026-02-25 16:12:46.315259138 +0000 UTC m=+1617.328651028" watchObservedRunningTime="2026-02-25 16:12:46.321093885 +0000 UTC m=+1617.334485785" Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.738600 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.862152 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68vbw\" (UniqueName: \"kubernetes.io/projected/8aa18c37-6541-40d6-97b1-007a024605ec-kube-api-access-68vbw\") pod \"8aa18c37-6541-40d6-97b1-007a024605ec\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.862554 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-ovsdbserver-sb\") pod \"8aa18c37-6541-40d6-97b1-007a024605ec\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.862611 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-config\") pod \"8aa18c37-6541-40d6-97b1-007a024605ec\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.862769 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-dns-svc\") pod \"8aa18c37-6541-40d6-97b1-007a024605ec\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.862857 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-ovsdbserver-nb\") pod \"8aa18c37-6541-40d6-97b1-007a024605ec\" (UID: \"8aa18c37-6541-40d6-97b1-007a024605ec\") " Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.903940 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aa18c37-6541-40d6-97b1-007a024605ec-kube-api-access-68vbw" (OuterVolumeSpecName: "kube-api-access-68vbw") pod "8aa18c37-6541-40d6-97b1-007a024605ec" (UID: "8aa18c37-6541-40d6-97b1-007a024605ec"). InnerVolumeSpecName "kube-api-access-68vbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.948138 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8aa18c37-6541-40d6-97b1-007a024605ec" (UID: "8aa18c37-6541-40d6-97b1-007a024605ec"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.969840 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.969866 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68vbw\" (UniqueName: \"kubernetes.io/projected/8aa18c37-6541-40d6-97b1-007a024605ec-kube-api-access-68vbw\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.970901 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-config" (OuterVolumeSpecName: "config") pod "8aa18c37-6541-40d6-97b1-007a024605ec" (UID: "8aa18c37-6541-40d6-97b1-007a024605ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.975838 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8aa18c37-6541-40d6-97b1-007a024605ec" (UID: "8aa18c37-6541-40d6-97b1-007a024605ec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:12:46 crc kubenswrapper[4937]: I0225 16:12:46.995906 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8tmw5" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.034005 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-754k4" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="registry-server" probeResult="failure" output=< Feb 25 16:12:47 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 16:12:47 crc kubenswrapper[4937]: > Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.046264 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8aa18c37-6541-40d6-97b1-007a024605ec" (UID: "8aa18c37-6541-40d6-97b1-007a024605ec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.075336 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-scripts\") pod \"8f988c32-d57e-4e63-add5-1e86a8818641\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.075455 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f988c32-d57e-4e63-add5-1e86a8818641-logs\") pod \"8f988c32-d57e-4e63-add5-1e86a8818641\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.079848 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f988c32-d57e-4e63-add5-1e86a8818641-logs" (OuterVolumeSpecName: "logs") pod "8f988c32-d57e-4e63-add5-1e86a8818641" (UID: "8f988c32-d57e-4e63-add5-1e86a8818641"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.081285 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r6xc\" (UniqueName: \"kubernetes.io/projected/8f988c32-d57e-4e63-add5-1e86a8818641-kube-api-access-6r6xc\") pod \"8f988c32-d57e-4e63-add5-1e86a8818641\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.081389 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-combined-ca-bundle\") pod \"8f988c32-d57e-4e63-add5-1e86a8818641\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.081652 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-config-data\") pod \"8f988c32-d57e-4e63-add5-1e86a8818641\" (UID: \"8f988c32-d57e-4e63-add5-1e86a8818641\") " Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.082895 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.082924 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.082937 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f988c32-d57e-4e63-add5-1e86a8818641-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.082947 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aa18c37-6541-40d6-97b1-007a024605ec-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.096185 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-scripts" (OuterVolumeSpecName: "scripts") pod "8f988c32-d57e-4e63-add5-1e86a8818641" (UID: "8f988c32-d57e-4e63-add5-1e86a8818641"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.121121 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f988c32-d57e-4e63-add5-1e86a8818641-kube-api-access-6r6xc" (OuterVolumeSpecName: "kube-api-access-6r6xc") pod "8f988c32-d57e-4e63-add5-1e86a8818641" (UID: "8f988c32-d57e-4e63-add5-1e86a8818641"). InnerVolumeSpecName "kube-api-access-6r6xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.141241 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f988c32-d57e-4e63-add5-1e86a8818641" (UID: "8f988c32-d57e-4e63-add5-1e86a8818641"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.154861 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-config-data" (OuterVolumeSpecName: "config-data") pod "8f988c32-d57e-4e63-add5-1e86a8818641" (UID: "8f988c32-d57e-4e63-add5-1e86a8818641"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.185834 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.185877 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.185895 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r6xc\" (UniqueName: \"kubernetes.io/projected/8f988c32-d57e-4e63-add5-1e86a8818641-kube-api-access-6r6xc\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.185909 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f988c32-d57e-4e63-add5-1e86a8818641-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.309708 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8tmw5" event={"ID":"8f988c32-d57e-4e63-add5-1e86a8818641","Type":"ContainerDied","Data":"1cd7f9326358cb1c4c96117b56a0d9c188c1e63bba939730c335a45ddd9ba232"} Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.309732 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8tmw5" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.309741 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cd7f9326358cb1c4c96117b56a0d9c188c1e63bba939730c335a45ddd9ba232" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.311699 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.324539 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-9tdjb" event={"ID":"8aa18c37-6541-40d6-97b1-007a024605ec","Type":"ContainerDied","Data":"eaf086c05cd4587e2aa4e67d0df0c7923001ac4d445fec99dcb2afa8c643215a"} Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.324918 4937 scope.go:117] "RemoveContainer" containerID="853b2165860ad0d6189ee4fb9db66b1cc68e7450c867432d6a46cf550228b6b2" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.412004 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-9tdjb"] Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.412243 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-9tdjb"] Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.549922 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5bc476bcbd-vwgwx"] Feb 25 16:12:47 crc kubenswrapper[4937]: E0225 16:12:47.550357 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa840e84-cc33-4fed-9f62-50fc286001be" containerName="oc" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.550373 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa840e84-cc33-4fed-9f62-50fc286001be" containerName="oc" Feb 25 16:12:47 crc kubenswrapper[4937]: E0225 16:12:47.550397 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aa18c37-6541-40d6-97b1-007a024605ec" containerName="dnsmasq-dns" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.550404 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa18c37-6541-40d6-97b1-007a024605ec" containerName="dnsmasq-dns" Feb 25 16:12:47 crc kubenswrapper[4937]: E0225 16:12:47.550412 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aa18c37-6541-40d6-97b1-007a024605ec" containerName="init" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.550418 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa18c37-6541-40d6-97b1-007a024605ec" containerName="init" Feb 25 16:12:47 crc kubenswrapper[4937]: E0225 16:12:47.550432 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f988c32-d57e-4e63-add5-1e86a8818641" containerName="placement-db-sync" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.550438 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f988c32-d57e-4e63-add5-1e86a8818641" containerName="placement-db-sync" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.550654 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aa18c37-6541-40d6-97b1-007a024605ec" containerName="dnsmasq-dns" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.550680 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa840e84-cc33-4fed-9f62-50fc286001be" containerName="oc" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.550690 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f988c32-d57e-4e63-add5-1e86a8818641" containerName="placement-db-sync" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.551692 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.560846 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bc476bcbd-vwgwx"] Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.563677 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.563898 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2lh9m" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.564306 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.563904 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.564437 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.694750 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-config-data\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.694788 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/443f8b26-eac8-403e-b311-07cf1cd7cb83-logs\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.694820 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-scripts\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.695036 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-internal-tls-certs\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.695353 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27wcd\" (UniqueName: \"kubernetes.io/projected/443f8b26-eac8-403e-b311-07cf1cd7cb83-kube-api-access-27wcd\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.695580 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-combined-ca-bundle\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.695607 4937 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-public-tls-certs\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.797819 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-scripts\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.797934 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-internal-tls-certs\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.797982 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27wcd\" (UniqueName: \"kubernetes.io/projected/443f8b26-eac8-403e-b311-07cf1cd7cb83-kube-api-access-27wcd\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.798090 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-combined-ca-bundle\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.798115 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-public-tls-certs\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.798154 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-config-data\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.798179 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/443f8b26-eac8-403e-b311-07cf1cd7cb83-logs\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.798694 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/443f8b26-eac8-403e-b311-07cf1cd7cb83-logs\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.803526 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-internal-tls-certs\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.803734 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-config-data\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.803782 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-combined-ca-bundle\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.805009 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-public-tls-certs\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.813176 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-scripts\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.817579 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27wcd\" (UniqueName: \"kubernetes.io/projected/443f8b26-eac8-403e-b311-07cf1cd7cb83-kube-api-access-27wcd\") pod \"placement-5bc476bcbd-vwgwx\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.973447 4937 scope.go:117] "RemoveContainer" containerID="e7414a68c6f534c57664fa2f6642ceb9b67135598d756b32a16ce3b14b78a418" Feb 25 16:12:47 crc kubenswrapper[4937]: I0225 16:12:47.997309 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.094076 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.204834 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nppm2\" (UniqueName: \"kubernetes.io/projected/69b3388f-2762-4ebe-a014-e1740aee3b66-kube-api-access-nppm2\") pod \"69b3388f-2762-4ebe-a014-e1740aee3b66\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.205246 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-scripts\") pod \"69b3388f-2762-4ebe-a014-e1740aee3b66\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.205286 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-fernet-keys\") pod \"69b3388f-2762-4ebe-a014-e1740aee3b66\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.205359 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-config-data\") pod \"69b3388f-2762-4ebe-a014-e1740aee3b66\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.205421 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-credential-keys\") pod \"69b3388f-2762-4ebe-a014-e1740aee3b66\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.205495 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-combined-ca-bundle\") pod \"69b3388f-2762-4ebe-a014-e1740aee3b66\" (UID: \"69b3388f-2762-4ebe-a014-e1740aee3b66\") " Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.211025 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "69b3388f-2762-4ebe-a014-e1740aee3b66" (UID: "69b3388f-2762-4ebe-a014-e1740aee3b66"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.211037 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69b3388f-2762-4ebe-a014-e1740aee3b66-kube-api-access-nppm2" (OuterVolumeSpecName: "kube-api-access-nppm2") pod "69b3388f-2762-4ebe-a014-e1740aee3b66" (UID: "69b3388f-2762-4ebe-a014-e1740aee3b66"). InnerVolumeSpecName "kube-api-access-nppm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.211617 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-scripts" (OuterVolumeSpecName: "scripts") pod "69b3388f-2762-4ebe-a014-e1740aee3b66" (UID: "69b3388f-2762-4ebe-a014-e1740aee3b66"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.212041 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "69b3388f-2762-4ebe-a014-e1740aee3b66" (UID: "69b3388f-2762-4ebe-a014-e1740aee3b66"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.231411 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-config-data" (OuterVolumeSpecName: "config-data") pod "69b3388f-2762-4ebe-a014-e1740aee3b66" (UID: "69b3388f-2762-4ebe-a014-e1740aee3b66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.233342 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69b3388f-2762-4ebe-a014-e1740aee3b66" (UID: "69b3388f-2762-4ebe-a014-e1740aee3b66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.307645 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.307686 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nppm2\" (UniqueName: \"kubernetes.io/projected/69b3388f-2762-4ebe-a014-e1740aee3b66-kube-api-access-nppm2\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.307702 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.307714 4937 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.307724 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.307733 4937 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69b3388f-2762-4ebe-a014-e1740aee3b66-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.324425 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ldjsj" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.324451 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ldjsj" event={"ID":"69b3388f-2762-4ebe-a014-e1740aee3b66","Type":"ContainerDied","Data":"46c0035f84f787435df418aab1c883e6c45db8e7804c218a77a868d711e7e7c6"} Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.324508 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46c0035f84f787435df418aab1c883e6c45db8e7804c218a77a868d711e7e7c6" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.477326 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7595479948-g6dtl"] Feb 25 16:12:48 crc kubenswrapper[4937]: E0225 16:12:48.477835 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69b3388f-2762-4ebe-a014-e1740aee3b66" containerName="keystone-bootstrap" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.477859 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="69b3388f-2762-4ebe-a014-e1740aee3b66" containerName="keystone-bootstrap" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.478087 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="69b3388f-2762-4ebe-a014-e1740aee3b66" containerName="keystone-bootstrap" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.478937 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.483386 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.483405 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d9p7q" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.483695 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.484740 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.485133 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.485241 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.495280 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bc476bcbd-vwgwx"] Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.521243 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7595479948-g6dtl"] Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.620351 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-fernet-keys\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.620720 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-combined-ca-bundle\") pod 
\"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.620744 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lnqq\" (UniqueName: \"kubernetes.io/projected/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-kube-api-access-6lnqq\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.620775 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-internal-tls-certs\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.620828 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-credential-keys\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.620853 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-config-data\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.621007 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-scripts\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.621088 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-public-tls-certs\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.723294 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-config-data\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.723342 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-credential-keys\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.723397 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-scripts\") pod 
\"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.723472 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-public-tls-certs\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.723629 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-fernet-keys\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.723655 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-combined-ca-bundle\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.723774 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-internal-tls-certs\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.724082 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lnqq\" (UniqueName: \"kubernetes.io/projected/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-kube-api-access-6lnqq\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.727727 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-internal-tls-certs\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.752900 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-fernet-keys\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.752986 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-config-data\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.753875 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-credential-keys\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 
crc kubenswrapper[4937]: I0225 16:12:48.753964 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-public-tls-certs\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.754182 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-scripts\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.754289 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-combined-ca-bundle\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:48 crc kubenswrapper[4937]: I0225 16:12:48.847425 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lnqq\" (UniqueName: \"kubernetes.io/projected/e3d2f89f-c1af-45d5-bfdd-6f9c3141c124-kube-api-access-6lnqq\") pod \"keystone-7595479948-g6dtl\" (UID: \"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124\") " pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:49 crc kubenswrapper[4937]: I0225 16:12:49.119349 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:49 crc kubenswrapper[4937]: I0225 16:12:49.439850 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aa18c37-6541-40d6-97b1-007a024605ec" path="/var/lib/kubelet/pods/8aa18c37-6541-40d6-97b1-007a024605ec/volumes" Feb 25 16:12:49 crc kubenswrapper[4937]: I0225 16:12:49.440880 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bc476bcbd-vwgwx" event={"ID":"443f8b26-eac8-403e-b311-07cf1cd7cb83","Type":"ContainerStarted","Data":"03ecb7f01fc0cd1c9e12aaa6747d29e34de1ecbc9b48b2ce60eec63988b7f37e"} Feb 25 16:12:49 crc kubenswrapper[4937]: I0225 16:12:49.440909 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bc476bcbd-vwgwx" event={"ID":"443f8b26-eac8-403e-b311-07cf1cd7cb83","Type":"ContainerStarted","Data":"c3b2bce77205dd9e6fdf6556261c229c82b64a5eb3ef54e20adec1688fd896cc"} Feb 25 16:12:49 crc kubenswrapper[4937]: I0225 16:12:49.476848 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a","Type":"ContainerStarted","Data":"e53a338234bbff2506138bbd9a7af5c6b64eff232a3760370d1421a7ecb5e61d"} Feb 25 16:12:49 crc kubenswrapper[4937]: I0225 16:12:49.750207 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7595479948-g6dtl"] Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.494198 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-b4zhr" event={"ID":"44849697-9b41-4439-b8c7-f497036543aa","Type":"ContainerStarted","Data":"c425910de0a5e69eaff03516c0ca9937a3f17e41f9618d78a391bfc1c88e8e11"} Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.515925 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"94994478fa12f2b311e339866a0c6c486fb9a210536637e152a72c37076804d6"} Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.519825 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-b4zhr" podStartSLOduration=3.236320723 podStartE2EDuration="1m4.519811121s" podCreationTimestamp="2026-02-25 16:11:46 +0000 UTC" firstStartedPulling="2026-02-25 16:11:48.416913187 +0000 UTC m=+1559.430305087" lastFinishedPulling="2026-02-25 16:12:49.700403605 +0000 UTC m=+1620.713795485" observedRunningTime="2026-02-25 16:12:50.511687427 +0000 UTC m=+1621.525079317" watchObservedRunningTime="2026-02-25 16:12:50.519811121 +0000 UTC m=+1621.533203011" Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.541213 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.541722 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bc476bcbd-vwgwx" event={"ID":"443f8b26-eac8-403e-b311-07cf1cd7cb83","Type":"ContainerStarted","Data":"78644883a09ab939d41e6f50c4dfff46be349cce6856355a8ab86ae21f7e9023"} Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.542621 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.549226 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7595479948-g6dtl" event={"ID":"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124","Type":"ContainerStarted","Data":"1b9ac140b6cae628c8382b859553b3004dd166c8c5c0a07bbf0b2c4213d02f1c"} Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.549276 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7595479948-g6dtl" event={"ID":"e3d2f89f-c1af-45d5-bfdd-6f9c3141c124","Type":"ContainerStarted","Data":"25dcc6517d0786531e327e4922545a73963f9bdbdbbe8390f038c58864645c3f"} Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.549577 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.560812 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.565481 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.565605 4937 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.568076 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a","Type":"ContainerStarted","Data":"c783b2f94dbd0137d3ac6c4feaa0b28d644ff9eb41f17f4b8d5d3d52a54b9b43"} Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.584564 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.630828 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7595479948-g6dtl" podStartSLOduration=2.630780201 podStartE2EDuration="2.630780201s" podCreationTimestamp="2026-02-25 16:12:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:12:50.601869246 +0000 UTC m=+1621.615261136" watchObservedRunningTime="2026-02-25 16:12:50.630780201 +0000 UTC m=+1621.644172101" Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.808096 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5bc476bcbd-vwgwx" podStartSLOduration=3.808074582 podStartE2EDuration="3.808074582s" podCreationTimestamp="2026-02-25 16:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:12:50.793742603 +0000 UTC m=+1621.807134493" watchObservedRunningTime="2026-02-25 16:12:50.808074582 +0000 UTC m=+1621.821466472" Feb 25 16:12:50 crc kubenswrapper[4937]: I0225 16:12:50.891511 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=27.891473631 podStartE2EDuration="27.891473631s" podCreationTimestamp="2026-02-25 16:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:12:50.865356597 +0000 UTC m=+1621.878748487" watchObservedRunningTime="2026-02-25 16:12:50.891473631 +0000 UTC m=+1621.904865521" Feb 25 16:12:51 crc kubenswrapper[4937]: I0225 16:12:51.621383 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"2b55197768e971b674c6fceff9d26bc88e11531bf36758ddfd76188949bcebf9"} Feb 25 16:12:51 crc kubenswrapper[4937]: I0225 16:12:51.621676 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"128243d3a6362fa72b79a0f12ce5373f2cde76bdb2d4287f9e0e648f2e6d09e2"} Feb 25 16:12:51 crc kubenswrapper[4937]: I0225 16:12:51.621688 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"3723cd4af78da748df297fa972fae254fff9bd21e89247dff223aa73fa91d5e1"} Feb 25 16:12:51 crc kubenswrapper[4937]: I0225 16:12:51.623457 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.535322 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-575b75bdd-mz6p6"] Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.537705 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.548332 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-575b75bdd-mz6p6"] Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.583621 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-public-tls-certs\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.583690 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-logs\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.583712 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-config-data\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.583767 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-internal-tls-certs\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.583803 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-combined-ca-bundle\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.583823 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-scripts\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.583846 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngsmx\" (UniqueName: \"kubernetes.io/projected/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-kube-api-access-ngsmx\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.645534 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"93b240672c70137def0b7210b8819c6d736b6d1ed4ca044935cbf649992dde28"} Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.645581 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"48d22af0-5579-46fb-889d-fd34e46d26e9","Type":"ContainerStarted","Data":"6c339da7db6719ca2ce97e23c9478c99607a89fdc2cb52d741698081b53706eb"} Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.681987 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=68.947603933 podStartE2EDuration="2m14.681951352s" podCreationTimestamp="2026-02-25 16:10:38 +0000 UTC" firstStartedPulling="2026-02-25 16:11:44.088358529 +0000 UTC m=+1555.101750409" lastFinishedPulling="2026-02-25 16:12:49.822705938 +0000 UTC m=+1620.836097828" observedRunningTime="2026-02-25 16:12:52.67745076 +0000 UTC m=+1623.690842660" watchObservedRunningTime="2026-02-25 16:12:52.681951352 +0000 UTC m=+1623.695343242" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.685328 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-combined-ca-bundle\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.685380 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-scripts\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.685411 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngsmx\" (UniqueName: \"kubernetes.io/projected/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-kube-api-access-ngsmx\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.685497 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-public-tls-certs\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.685536 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-logs\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.685556 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-config-data\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.685613 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-internal-tls-certs\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.686294 4937 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-logs\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.690101 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-internal-tls-certs\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.693887 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-config-data\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.695814 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-scripts\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.704742 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-combined-ca-bundle\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.705016 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-public-tls-certs\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.711637 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngsmx\" (UniqueName: \"kubernetes.io/projected/b9abdfa0-773f-4d50-ae2d-8d7a429b5df7-kube-api-access-ngsmx\") pod \"placement-575b75bdd-mz6p6\" (UID: \"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7\") " pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:52 crc kubenswrapper[4937]: I0225 16:12:52.857764 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.104830 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mnht5"] Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.106792 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.111572 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.131173 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mnht5"] Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.193610 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.193804 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-config\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.193893 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.210749 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-dns-svc\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.211041 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jdvk\" (UniqueName: \"kubernetes.io/projected/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-kube-api-access-7jdvk\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.211146 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.312445 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-dns-svc\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.312513 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jdvk\" (UniqueName: \"kubernetes.io/projected/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-kube-api-access-7jdvk\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: 
\"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.312546 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.312625 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.312688 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-config\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.312716 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.313286 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-dns-svc\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.313844 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.313869 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.314433 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-config\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.317120 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc 
kubenswrapper[4937]: I0225 16:12:53.339746 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jdvk\" (UniqueName: \"kubernetes.io/projected/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-kube-api-access-7jdvk\") pod \"dnsmasq-dns-55f844cf75-mnht5\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.468806 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.612859 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-575b75bdd-mz6p6"] Feb 25 16:12:53 crc kubenswrapper[4937]: W0225 16:12:53.635566 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9abdfa0_773f_4d50_ae2d_8d7a429b5df7.slice/crio-684bc6d993fd98a124e92b8724f9025e30b950e5ad50272464423388c554f4c4 WatchSource:0}: Error finding container 684bc6d993fd98a124e92b8724f9025e30b950e5ad50272464423388c554f4c4: Status 404 returned error can't find the container with id 684bc6d993fd98a124e92b8724f9025e30b950e5ad50272464423388c554f4c4 Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.667022 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-575b75bdd-mz6p6" event={"ID":"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7","Type":"ContainerStarted","Data":"684bc6d993fd98a124e92b8724f9025e30b950e5ad50272464423388c554f4c4"} Feb 25 16:12:53 crc kubenswrapper[4937]: I0225 16:12:53.934300 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mnht5"] Feb 25 16:12:54 crc kubenswrapper[4937]: I0225 16:12:54.356222 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:54 crc kubenswrapper[4937]: I0225 16:12:54.356554 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:54 crc kubenswrapper[4937]: I0225 16:12:54.364007 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:54 crc kubenswrapper[4937]: I0225 16:12:54.679271 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-575b75bdd-mz6p6" event={"ID":"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7","Type":"ContainerStarted","Data":"8887271f2128ee926dd5996ab25e6bba279d849f9f58f58a09e0b5b06196f7b0"} Feb 25 16:12:54 crc kubenswrapper[4937]: I0225 16:12:54.680521 4937 generic.go:334] "Generic (PLEG): container finished" podID="bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" containerID="aafa5de31067549491821868bb9690f582a869a3ae87fcc3604bf8f575382849" exitCode=0 Feb 25 16:12:54 crc kubenswrapper[4937]: I0225 16:12:54.682212 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mnht5" event={"ID":"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2","Type":"ContainerDied","Data":"aafa5de31067549491821868bb9690f582a869a3ae87fcc3604bf8f575382849"} Feb 25 16:12:54 crc kubenswrapper[4937]: I0225 16:12:54.682242 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mnht5" event={"ID":"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2","Type":"ContainerStarted","Data":"10b59577f3be35222ce77a8233037a286ab5a03d2ebbbaadf85bc95a759b09ac"} Feb 25 16:12:54 crc kubenswrapper[4937]: I0225 16:12:54.686054 4937 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 25 16:12:55 crc kubenswrapper[4937]: I0225 16:12:55.709247 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-575b75bdd-mz6p6" event={"ID":"b9abdfa0-773f-4d50-ae2d-8d7a429b5df7","Type":"ContainerStarted","Data":"dd3ef4e6eaa6a1693c300f8d2cb237bec889d0bcbc9b89301527b0dd3d42fef9"} Feb 25 16:12:55 crc kubenswrapper[4937]: I0225 16:12:55.716788 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:55 crc kubenswrapper[4937]: I0225 16:12:55.716842 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:12:55 crc kubenswrapper[4937]: I0225 16:12:55.719709 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mnht5" event={"ID":"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2","Type":"ContainerStarted","Data":"6271ee23a9a65220e71abb8a77f63bc4f467c985d3b03864e9446c4fbf622ec2"} Feb 25 16:12:55 crc kubenswrapper[4937]: I0225 16:12:55.719905 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:12:55 crc kubenswrapper[4937]: I0225 16:12:55.748932 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-575b75bdd-mz6p6" podStartSLOduration=3.7489072180000003 podStartE2EDuration="3.748907218s" podCreationTimestamp="2026-02-25 16:12:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:12:55.734396614 +0000 UTC m=+1626.747788514" watchObservedRunningTime="2026-02-25 16:12:55.748907218 +0000 UTC m=+1626.762299108" Feb 25 16:12:55 crc kubenswrapper[4937]: I0225 16:12:55.764121 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-mnht5" podStartSLOduration=2.764099269 podStartE2EDuration="2.764099269s" podCreationTimestamp="2026-02-25 16:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:12:55.758393176 +0000 UTC m=+1626.771785066" watchObservedRunningTime="2026-02-25 16:12:55.764099269 +0000 UTC m=+1626.777491159" Feb 25 16:12:57 crc kubenswrapper[4937]: I0225 16:12:57.028947 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-754k4" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="registry-server" probeResult="failure" output=< Feb 25 16:12:57 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 16:12:57 crc kubenswrapper[4937]: > Feb 25 16:12:58 crc kubenswrapper[4937]: I0225 16:12:58.781509 4937 generic.go:334] "Generic (PLEG): container finished" podID="38a537ec-7743-44bd-b428-fa52adf39305" containerID="bac64cbf62bbf5c9cb7526af1128d38eef5d873e3a5f17b5965cad8e809efe3c" exitCode=0 Feb 25 16:12:58 crc kubenswrapper[4937]: I0225 16:12:58.781612 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7w4sz" event={"ID":"38a537ec-7743-44bd-b428-fa52adf39305","Type":"ContainerDied","Data":"bac64cbf62bbf5c9cb7526af1128d38eef5d873e3a5f17b5965cad8e809efe3c"} Feb 25 16:13:00 crc kubenswrapper[4937]: I0225 16:13:00.641773 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7w4sz" Feb 25 16:13:00 crc kubenswrapper[4937]: I0225 16:13:00.682945 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a537ec-7743-44bd-b428-fa52adf39305-combined-ca-bundle\") pod \"38a537ec-7743-44bd-b428-fa52adf39305\" (UID: \"38a537ec-7743-44bd-b428-fa52adf39305\") " Feb 25 16:13:00 crc kubenswrapper[4937]: I0225 16:13:00.683389 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5cb6\" (UniqueName: \"kubernetes.io/projected/38a537ec-7743-44bd-b428-fa52adf39305-kube-api-access-z5cb6\") pod \"38a537ec-7743-44bd-b428-fa52adf39305\" (UID: \"38a537ec-7743-44bd-b428-fa52adf39305\") " Feb 25 16:13:00 crc kubenswrapper[4937]: I0225 16:13:00.683439 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38a537ec-7743-44bd-b428-fa52adf39305-db-sync-config-data\") pod \"38a537ec-7743-44bd-b428-fa52adf39305\" (UID: \"38a537ec-7743-44bd-b428-fa52adf39305\") " Feb 25 16:13:00 crc kubenswrapper[4937]: I0225 16:13:00.695756 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a537ec-7743-44bd-b428-fa52adf39305-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "38a537ec-7743-44bd-b428-fa52adf39305" (UID: "38a537ec-7743-44bd-b428-fa52adf39305"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:00 crc kubenswrapper[4937]: I0225 16:13:00.695861 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a537ec-7743-44bd-b428-fa52adf39305-kube-api-access-z5cb6" (OuterVolumeSpecName: "kube-api-access-z5cb6") pod "38a537ec-7743-44bd-b428-fa52adf39305" (UID: "38a537ec-7743-44bd-b428-fa52adf39305"). InnerVolumeSpecName "kube-api-access-z5cb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:00 crc kubenswrapper[4937]: I0225 16:13:00.721590 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a537ec-7743-44bd-b428-fa52adf39305-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38a537ec-7743-44bd-b428-fa52adf39305" (UID: "38a537ec-7743-44bd-b428-fa52adf39305"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:00 crc kubenswrapper[4937]: I0225 16:13:00.786258 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5cb6\" (UniqueName: \"kubernetes.io/projected/38a537ec-7743-44bd-b428-fa52adf39305-kube-api-access-z5cb6\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:00 crc kubenswrapper[4937]: I0225 16:13:00.786299 4937 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38a537ec-7743-44bd-b428-fa52adf39305-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:00 crc kubenswrapper[4937]: I0225 16:13:00.786308 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a537ec-7743-44bd-b428-fa52adf39305-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:00 crc kubenswrapper[4937]: I0225 16:13:00.805212 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7w4sz" event={"ID":"38a537ec-7743-44bd-b428-fa52adf39305","Type":"ContainerDied","Data":"7c642085466985ced2df8bee6d25317f1128749d0efbef394ae553a98c3d1e76"} Feb 25 16:13:00 crc kubenswrapper[4937]: I0225 16:13:00.805265 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c642085466985ced2df8bee6d25317f1128749d0efbef394ae553a98c3d1e76" Feb 25 16:13:00 crc kubenswrapper[4937]: I0225 16:13:00.805320 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7w4sz" Feb 25 16:13:00 crc kubenswrapper[4937]: E0225 16:13:00.959885 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.145706 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6b99d98bc-2r54q"] Feb 25 16:13:01 crc kubenswrapper[4937]: E0225 16:13:01.146399 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a537ec-7743-44bd-b428-fa52adf39305" containerName="barbican-db-sync" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.146421 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a537ec-7743-44bd-b428-fa52adf39305" containerName="barbican-db-sync" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.146642 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a537ec-7743-44bd-b428-fa52adf39305" containerName="barbican-db-sync" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.148074 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.158890 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5w7rh" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.159120 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.159739 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.182558 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b99d98bc-2r54q"] Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.194350 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394cbe6e-1697-449d-abaf-68e9ba275096-logs\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.205378 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6b9499bbcd-kr7kb"] Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.207664 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.210901 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.210981 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394cbe6e-1697-449d-abaf-68e9ba275096-config-data\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.211057 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmjgl\" (UniqueName: \"kubernetes.io/projected/394cbe6e-1697-449d-abaf-68e9ba275096-kube-api-access-lmjgl\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.211118 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/394cbe6e-1697-449d-abaf-68e9ba275096-config-data-custom\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.211510 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394cbe6e-1697-449d-abaf-68e9ba275096-combined-ca-bundle\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.259226 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-6b9499bbcd-kr7kb"] Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.314351 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmjgl\" (UniqueName: \"kubernetes.io/projected/394cbe6e-1697-449d-abaf-68e9ba275096-kube-api-access-lmjgl\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.314955 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/394cbe6e-1697-449d-abaf-68e9ba275096-config-data-custom\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.315056 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-logs\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.315079 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394cbe6e-1697-449d-abaf-68e9ba275096-combined-ca-bundle\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.315131 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4rnl\" (UniqueName: \"kubernetes.io/projected/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-kube-api-access-c4rnl\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.315156 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-config-data\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.315186 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-combined-ca-bundle\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.315208 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394cbe6e-1697-449d-abaf-68e9ba275096-logs\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.315237 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-config-data-custom\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.315328 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394cbe6e-1697-449d-abaf-68e9ba275096-config-data\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.319187 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394cbe6e-1697-449d-abaf-68e9ba275096-logs\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.320357 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mnht5"] Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.326777 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-mnht5" podUID="bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" containerName="dnsmasq-dns" containerID="cri-o://6271ee23a9a65220e71abb8a77f63bc4f467c985d3b03864e9446c4fbf622ec2" gracePeriod=10 Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.332689 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.335963 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394cbe6e-1697-449d-abaf-68e9ba275096-config-data\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.336400 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/394cbe6e-1697-449d-abaf-68e9ba275096-config-data-custom\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.337239 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394cbe6e-1697-449d-abaf-68e9ba275096-combined-ca-bundle\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.358705 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-69bf8b4fb-nfx4x"] Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.360331 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.366005 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.370524 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmjgl\" (UniqueName: \"kubernetes.io/projected/394cbe6e-1697-449d-abaf-68e9ba275096-kube-api-access-lmjgl\") pod \"barbican-worker-6b99d98bc-2r54q\" (UID: \"394cbe6e-1697-449d-abaf-68e9ba275096\") " pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.418546 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vdxzb"] Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.444954 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-config-data-custom\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.445075 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-logs\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.445232 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwlgb\" (UniqueName: \"kubernetes.io/projected/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-kube-api-access-mwlgb\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.445263 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-config-data\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.445683 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-combined-ca-bundle\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.445784 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-logs\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.445992 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4rnl\" (UniqueName: \"kubernetes.io/projected/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-kube-api-access-c4rnl\") pod 
\"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.446026 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-config-data-custom\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.446239 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-config-data\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.446281 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-combined-ca-bundle\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.470857 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69bf8b4fb-nfx4x"] Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.470933 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vdxzb"] Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.471064 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.493415 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-logs\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.495506 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6b99d98bc-2r54q" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.504446 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-config-data-custom\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.508333 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-combined-ca-bundle\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.524149 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4rnl\" (UniqueName: \"kubernetes.io/projected/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-kube-api-access-c4rnl\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.558074 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6520b70f-9bf6-4b3c-ad1e-4f43da8daec5-config-data\") pod \"barbican-keystone-listener-6b9499bbcd-kr7kb\" (UID: \"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5\") " pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.598997 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-combined-ca-bundle\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.606520 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.606573 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.606742 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-config-data-custom\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.606768 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.606348 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-combined-ca-bundle\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.613746 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-logs\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.613875 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffr8d\" (UniqueName: \"kubernetes.io/projected/32f1341d-07b8-4522-8541-c01f9e9ce74a-kube-api-access-ffr8d\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.613953 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwlgb\" (UniqueName: \"kubernetes.io/projected/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-kube-api-access-mwlgb\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.613988 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-config-data\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.614080 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-config\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.614099 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-dns-svc\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.614655 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-logs\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.626661 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-config-data-custom\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.631776 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-config-data\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.649964 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwlgb\" (UniqueName: \"kubernetes.io/projected/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-kube-api-access-mwlgb\") pod \"barbican-api-69bf8b4fb-nfx4x\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.715672 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffr8d\" (UniqueName: \"kubernetes.io/projected/32f1341d-07b8-4522-8541-c01f9e9ce74a-kube-api-access-ffr8d\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.716047 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-config\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.716063 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-dns-svc\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.716111 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.716134 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.716173 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.717032 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.717884 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-config\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.718403 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-dns-svc\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.719995 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.728334 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.729155 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.751167 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffr8d\" (UniqueName: \"kubernetes.io/projected/32f1341d-07b8-4522-8541-c01f9e9ce74a-kube-api-access-ffr8d\") pod \"dnsmasq-dns-85ff748b95-vdxzb\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.827054 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.854926 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.920058 4937 generic.go:334] "Generic (PLEG): container finished" podID="bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" containerID="6271ee23a9a65220e71abb8a77f63bc4f467c985d3b03864e9446c4fbf622ec2" exitCode=0 Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.920169 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mnht5" event={"ID":"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2","Type":"ContainerDied","Data":"6271ee23a9a65220e71abb8a77f63bc4f467c985d3b03864e9446c4fbf622ec2"} Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.941791 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ef5e4a5-46f1-4f72-ab91-699865d33243","Type":"ContainerStarted","Data":"8b4a77614d7a0cf6d8e9ddcef449aae52cf40dd692c6145777ac09c87acce818"} Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.941968 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerName="ceilometer-notification-agent" containerID="cri-o://975bf9c7209b653fdf9d6a2d25684b654b89782f6a2efef799f7e3f9f4ae31b7" gracePeriod=30 Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.942274 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.942571 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerName="proxy-httpd" containerID="cri-o://8b4a77614d7a0cf6d8e9ddcef449aae52cf40dd692c6145777ac09c87acce818" gracePeriod=30 Feb 25 16:13:01 crc kubenswrapper[4937]: I0225 16:13:01.942624 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerName="sg-core" containerID="cri-o://be8325fa6e888793158a1f475725134ab900615886cb7f8cc349d8d9d4c67032" gracePeriod=30 Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.237223 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.346330 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-config\") pod \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.346399 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-dns-svc\") pod \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.346522 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-ovsdbserver-sb\") pod \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.346626 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-dns-swift-storage-0\") pod \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.346730 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-ovsdbserver-nb\") pod \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.346783 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jdvk\" (UniqueName: \"kubernetes.io/projected/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-kube-api-access-7jdvk\") pod \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\" (UID: \"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2\") " Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.362866 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-kube-api-access-7jdvk" (OuterVolumeSpecName: "kube-api-access-7jdvk") pod "bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" (UID: "bc3e04ed-6687-4136-b1bc-b4c19bcb26a2"). InnerVolumeSpecName "kube-api-access-7jdvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.412801 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b99d98bc-2r54q"] Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.434379 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" (UID: "bc3e04ed-6687-4136-b1bc-b4c19bcb26a2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.443736 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" (UID: "bc3e04ed-6687-4136-b1bc-b4c19bcb26a2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.451787 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.451813 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jdvk\" (UniqueName: \"kubernetes.io/projected/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-kube-api-access-7jdvk\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.451822 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.462296 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-config" (OuterVolumeSpecName: "config") pod "bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" (UID: "bc3e04ed-6687-4136-b1bc-b4c19bcb26a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.465907 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" (UID: "bc3e04ed-6687-4136-b1bc-b4c19bcb26a2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.473809 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" (UID: "bc3e04ed-6687-4136-b1bc-b4c19bcb26a2"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.553701 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.553731 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.553741 4937 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.645156 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b9499bbcd-kr7kb"] Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.668115 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vdxzb"] Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.859793 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69bf8b4fb-nfx4x"] Feb 25 16:13:02 crc kubenswrapper[4937]: W0225 16:13:02.868750 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ba5074b_5bcc_4028_9fe9_7ea203f32b25.slice/crio-f6b2607ac85e9b653ed3d4f242c71b568d525aa1ca72ebcfade6b7ac8809b92e WatchSource:0}: Error finding container f6b2607ac85e9b653ed3d4f242c71b568d525aa1ca72ebcfade6b7ac8809b92e: Status 404 returned error can't find the container with id f6b2607ac85e9b653ed3d4f242c71b568d525aa1ca72ebcfade6b7ac8809b92e Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.956760 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" event={"ID":"32f1341d-07b8-4522-8541-c01f9e9ce74a","Type":"ContainerStarted","Data":"0c2e732b702e521f8c983d9001fda18bc4b5480e98d67a645a4e4986b705a835"} Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.966922 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" event={"ID":"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5","Type":"ContainerStarted","Data":"64e3d83378862f273e8c7243d58f965e9e51c300a513ea2af2d334850c937313"} Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.969571 4937 generic.go:334] "Generic (PLEG): container finished" podID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerID="8b4a77614d7a0cf6d8e9ddcef449aae52cf40dd692c6145777ac09c87acce818" exitCode=0 Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.969603 4937 generic.go:334] "Generic (PLEG): container finished" podID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerID="be8325fa6e888793158a1f475725134ab900615886cb7f8cc349d8d9d4c67032" exitCode=2 Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.969633 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ef5e4a5-46f1-4f72-ab91-699865d33243","Type":"ContainerDied","Data":"8b4a77614d7a0cf6d8e9ddcef449aae52cf40dd692c6145777ac09c87acce818"} Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.969651 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8ef5e4a5-46f1-4f72-ab91-699865d33243","Type":"ContainerDied","Data":"be8325fa6e888793158a1f475725134ab900615886cb7f8cc349d8d9d4c67032"} Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.971274 4937 generic.go:334] "Generic (PLEG): container finished" podID="006fb5e7-a244-4758-8065-3615f5a2b9b7" containerID="e1a6a643635b7ded6f035d06b0c8511691bcf92f830f74575dc22a413a153f75" exitCode=0 Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.971310 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6rpk2" event={"ID":"006fb5e7-a244-4758-8065-3615f5a2b9b7","Type":"ContainerDied","Data":"e1a6a643635b7ded6f035d06b0c8511691bcf92f830f74575dc22a413a153f75"} Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.972843 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69bf8b4fb-nfx4x" event={"ID":"8ba5074b-5bcc-4028-9fe9-7ea203f32b25","Type":"ContainerStarted","Data":"f6b2607ac85e9b653ed3d4f242c71b568d525aa1ca72ebcfade6b7ac8809b92e"} Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.974037 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b99d98bc-2r54q" event={"ID":"394cbe6e-1697-449d-abaf-68e9ba275096","Type":"ContainerStarted","Data":"bc1e87ac4c449d1c79e615986626efc3e857d22a9fbf3fb50e0c67fea9d5efd5"} Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.977348 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mnht5" event={"ID":"bc3e04ed-6687-4136-b1bc-b4c19bcb26a2","Type":"ContainerDied","Data":"10b59577f3be35222ce77a8233037a286ab5a03d2ebbbaadf85bc95a759b09ac"} Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.977383 4937 scope.go:117] "RemoveContainer" containerID="6271ee23a9a65220e71abb8a77f63bc4f467c985d3b03864e9446c4fbf622ec2" Feb 25 16:13:02 crc kubenswrapper[4937]: I0225 16:13:02.977410 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mnht5" Feb 25 16:13:03 crc kubenswrapper[4937]: I0225 16:13:03.021278 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mnht5"] Feb 25 16:13:03 crc kubenswrapper[4937]: I0225 16:13:03.028106 4937 scope.go:117] "RemoveContainer" containerID="aafa5de31067549491821868bb9690f582a869a3ae87fcc3604bf8f575382849" Feb 25 16:13:03 crc kubenswrapper[4937]: I0225 16:13:03.030221 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mnht5"] Feb 25 16:13:03 crc kubenswrapper[4937]: I0225 16:13:03.385773 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" path="/var/lib/kubelet/pods/bc3e04ed-6687-4136-b1bc-b4c19bcb26a2/volumes" Feb 25 16:13:03 crc kubenswrapper[4937]: I0225 16:13:03.993303 4937 generic.go:334] "Generic (PLEG): container finished" podID="44849697-9b41-4439-b8c7-f497036543aa" containerID="c425910de0a5e69eaff03516c0ca9937a3f17e41f9618d78a391bfc1c88e8e11" exitCode=0 Feb 25 16:13:03 crc kubenswrapper[4937]: I0225 16:13:03.993376 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-b4zhr" event={"ID":"44849697-9b41-4439-b8c7-f497036543aa","Type":"ContainerDied","Data":"c425910de0a5e69eaff03516c0ca9937a3f17e41f9618d78a391bfc1c88e8e11"} Feb 25 16:13:03 crc kubenswrapper[4937]: I0225 16:13:03.995242 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69bf8b4fb-nfx4x" event={"ID":"8ba5074b-5bcc-4028-9fe9-7ea203f32b25","Type":"ContainerStarted","Data":"a453a521d7da1667622e4fa12753c941359a7de17fd65b93adc1a0a83ecd77e1"} Feb 25 16:13:03 crc kubenswrapper[4937]: I0225 16:13:03.995278 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69bf8b4fb-nfx4x" event={"ID":"8ba5074b-5bcc-4028-9fe9-7ea203f32b25","Type":"ContainerStarted","Data":"17dcd3c571d10ecb5fbeba2d96fa4f01b7d503a85b38e859b46e71d755d087e5"} Feb 25 16:13:03 crc kubenswrapper[4937]: I0225 16:13:03.995302 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:03 crc kubenswrapper[4937]: I0225 16:13:03.995331 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.010125 4937 generic.go:334] "Generic (PLEG): container finished" podID="32f1341d-07b8-4522-8541-c01f9e9ce74a" containerID="eaf7f5567bab5c5686a976e8d10b599dc3fa8260408f62be89c0e93ba224c394" exitCode=0 Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.011121 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" event={"ID":"32f1341d-07b8-4522-8541-c01f9e9ce74a","Type":"ContainerDied","Data":"eaf7f5567bab5c5686a976e8d10b599dc3fa8260408f62be89c0e93ba224c394"} Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.102565 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-69bf8b4fb-nfx4x" podStartSLOduration=3.102541382 podStartE2EDuration="3.102541382s" podCreationTimestamp="2026-02-25 16:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:04.0892777 +0000 UTC m=+1635.102669580" watchObservedRunningTime="2026-02-25 16:13:04.102541382 +0000 UTC m=+1635.115933272" Feb 25 16:13:04 crc kubenswrapper[4937]: 
I0225 16:13:04.200281 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6c757b5c5d-sqs2g"] Feb 25 16:13:04 crc kubenswrapper[4937]: E0225 16:13:04.200712 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" containerName="dnsmasq-dns" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.200729 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" containerName="dnsmasq-dns" Feb 25 16:13:04 crc kubenswrapper[4937]: E0225 16:13:04.200752 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" containerName="init" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.200759 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" containerName="init" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.200933 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc3e04ed-6687-4136-b1bc-b4c19bcb26a2" containerName="dnsmasq-dns" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.202007 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.205067 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.205238 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.211737 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c757b5c5d-sqs2g"] Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.301986 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nldfk\" (UniqueName: \"kubernetes.io/projected/11e5858d-ee0a-4f76-8863-25be5ef4df36-kube-api-access-nldfk\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.302041 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-internal-tls-certs\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.302133 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-public-tls-certs\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.302165 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11e5858d-ee0a-4f76-8863-25be5ef4df36-logs\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.302212 4937 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-config-data\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.302241 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-combined-ca-bundle\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.302262 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-config-data-custom\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.403868 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-internal-tls-certs\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.403980 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-public-tls-certs\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.404552 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11e5858d-ee0a-4f76-8863-25be5ef4df36-logs\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.404637 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-config-data\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.404687 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-combined-ca-bundle\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.404735 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-config-data-custom\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.404844 4937 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nldfk\" (UniqueName: \"kubernetes.io/projected/11e5858d-ee0a-4f76-8863-25be5ef4df36-kube-api-access-nldfk\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.409512 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11e5858d-ee0a-4f76-8863-25be5ef4df36-logs\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.410377 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-public-tls-certs\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.411861 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-internal-tls-certs\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.414612 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-config-data-custom\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.415889 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-combined-ca-bundle\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.426571 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11e5858d-ee0a-4f76-8863-25be5ef4df36-config-data\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.435192 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nldfk\" (UniqueName: \"kubernetes.io/projected/11e5858d-ee0a-4f76-8863-25be5ef4df36-kube-api-access-nldfk\") pod \"barbican-api-6c757b5c5d-sqs2g\" (UID: \"11e5858d-ee0a-4f76-8863-25be5ef4df36\") " pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.531668 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.877092 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.923138 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-config-data\") pod \"006fb5e7-a244-4758-8065-3615f5a2b9b7\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.923217 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp2c7\" (UniqueName: \"kubernetes.io/projected/006fb5e7-a244-4758-8065-3615f5a2b9b7-kube-api-access-zp2c7\") pod \"006fb5e7-a244-4758-8065-3615f5a2b9b7\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.923313 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-combined-ca-bundle\") pod \"006fb5e7-a244-4758-8065-3615f5a2b9b7\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.923377 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/006fb5e7-a244-4758-8065-3615f5a2b9b7-etc-machine-id\") pod \"006fb5e7-a244-4758-8065-3615f5a2b9b7\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.923537 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-scripts\") pod \"006fb5e7-a244-4758-8065-3615f5a2b9b7\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.923578 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-db-sync-config-data\") pod \"006fb5e7-a244-4758-8065-3615f5a2b9b7\" (UID: \"006fb5e7-a244-4758-8065-3615f5a2b9b7\") " Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.926223 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/006fb5e7-a244-4758-8065-3615f5a2b9b7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "006fb5e7-a244-4758-8065-3615f5a2b9b7" (UID: "006fb5e7-a244-4758-8065-3615f5a2b9b7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.932815 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "006fb5e7-a244-4758-8065-3615f5a2b9b7" (UID: "006fb5e7-a244-4758-8065-3615f5a2b9b7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.933386 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/006fb5e7-a244-4758-8065-3615f5a2b9b7-kube-api-access-zp2c7" (OuterVolumeSpecName: "kube-api-access-zp2c7") pod "006fb5e7-a244-4758-8065-3615f5a2b9b7" (UID: "006fb5e7-a244-4758-8065-3615f5a2b9b7"). InnerVolumeSpecName "kube-api-access-zp2c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:04 crc kubenswrapper[4937]: I0225 16:13:04.941835 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-scripts" (OuterVolumeSpecName: "scripts") pod "006fb5e7-a244-4758-8065-3615f5a2b9b7" (UID: "006fb5e7-a244-4758-8065-3615f5a2b9b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.043308 4937 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/006fb5e7-a244-4758-8065-3615f5a2b9b7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.043336 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.043345 4937 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.043358 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp2c7\" (UniqueName: \"kubernetes.io/projected/006fb5e7-a244-4758-8065-3615f5a2b9b7-kube-api-access-zp2c7\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.045714 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6rpk2" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.046464 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6rpk2" event={"ID":"006fb5e7-a244-4758-8065-3615f5a2b9b7","Type":"ContainerDied","Data":"545b80e17fb664f28caed5eeb37b07fc6940230f941b48a47f04d1d5e8789ca2"} Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.046524 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="545b80e17fb664f28caed5eeb37b07fc6940230f941b48a47f04d1d5e8789ca2" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.092655 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "006fb5e7-a244-4758-8065-3615f5a2b9b7" (UID: "006fb5e7-a244-4758-8065-3615f5a2b9b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.098789 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-config-data" (OuterVolumeSpecName: "config-data") pod "006fb5e7-a244-4758-8065-3615f5a2b9b7" (UID: "006fb5e7-a244-4758-8065-3615f5a2b9b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.107522 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c757b5c5d-sqs2g"] Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.145124 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.145525 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006fb5e7-a244-4758-8065-3615f5a2b9b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.295378 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 16:13:05 crc kubenswrapper[4937]: E0225 16:13:05.295846 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="006fb5e7-a244-4758-8065-3615f5a2b9b7" containerName="cinder-db-sync" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.295865 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="006fb5e7-a244-4758-8065-3615f5a2b9b7" containerName="cinder-db-sync" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.296070 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="006fb5e7-a244-4758-8065-3615f5a2b9b7" containerName="cinder-db-sync" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.302123 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.315985 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.324195 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.410923 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vdxzb"] Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.453643 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-scripts\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.453699 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx8z8\" (UniqueName: \"kubernetes.io/projected/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-kube-api-access-rx8z8\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.453725 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.453788 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.453834 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.453854 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-config-data\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.469424 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-8ph6z"] Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.471413 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.508187 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-8ph6z"] Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.556502 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.556638 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.556688 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.556729 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-config-data\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.556770 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.556839 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.556862 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-scripts\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.556903 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjf6s\" (UniqueName: \"kubernetes.io/projected/2d9a76a7-a730-4436-956e-d43596599433-kube-api-access-hjf6s\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.556938 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx8z8\" (UniqueName: \"kubernetes.io/projected/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-kube-api-access-rx8z8\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.556990 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.557009 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-config\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.557060 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.557216 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.560057 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.564130 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.567745 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-scripts\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.574380 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-config-data\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.650775 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx8z8\" (UniqueName: \"kubernetes.io/projected/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-kube-api-access-rx8z8\") pod \"cinder-scheduler-0\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.653426 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.660542 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.660621 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.660652 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjf6s\" (UniqueName: \"kubernetes.io/projected/2d9a76a7-a730-4436-956e-d43596599433-kube-api-access-hjf6s\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.660698 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-config\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.660733 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.660797 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.667552 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.668414 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.672716 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.680015 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.684884 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-config\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.685416 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.696208 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjf6s\" (UniqueName: \"kubernetes.io/projected/2d9a76a7-a730-4436-956e-d43596599433-kube-api-access-hjf6s\") pod \"dnsmasq-dns-5c9776ccc5-8ph6z\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.762114 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.762412 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.764462 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.788025 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.833898 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.869642 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da32b76e-0420-4e12-8b28-8a865a41d899-logs\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.869733 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.869765 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-config-data\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.869810 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-config-data-custom\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.869838 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-scripts\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.869875 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgmzj\" (UniqueName: \"kubernetes.io/projected/da32b76e-0420-4e12-8b28-8a865a41d899-kube-api-access-zgmzj\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.869898 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da32b76e-0420-4e12-8b28-8a865a41d899-etc-machine-id\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.881895 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.970808 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/44849697-9b41-4439-b8c7-f497036543aa-certs\") pod \"44849697-9b41-4439-b8c7-f497036543aa\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.971235 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-scripts\") pod \"44849697-9b41-4439-b8c7-f497036543aa\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.971446 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-combined-ca-bundle\") pod \"44849697-9b41-4439-b8c7-f497036543aa\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.971511 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-config-data\") pod \"44849697-9b41-4439-b8c7-f497036543aa\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.971568 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn6jr\" (UniqueName: \"kubernetes.io/projected/44849697-9b41-4439-b8c7-f497036543aa-kube-api-access-tn6jr\") pod \"44849697-9b41-4439-b8c7-f497036543aa\" (UID: \"44849697-9b41-4439-b8c7-f497036543aa\") " Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.972337 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgmzj\" (UniqueName: \"kubernetes.io/projected/da32b76e-0420-4e12-8b28-8a865a41d899-kube-api-access-zgmzj\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.972386 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da32b76e-0420-4e12-8b28-8a865a41d899-etc-machine-id\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.972540 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da32b76e-0420-4e12-8b28-8a865a41d899-logs\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.972674 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.972711 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-config-data\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") 
" pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.973198 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da32b76e-0420-4e12-8b28-8a865a41d899-logs\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.974665 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da32b76e-0420-4e12-8b28-8a865a41d899-etc-machine-id\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.974768 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-config-data-custom\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.974816 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-scripts\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.980372 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44849697-9b41-4439-b8c7-f497036543aa-certs" (OuterVolumeSpecName: "certs") pod "44849697-9b41-4439-b8c7-f497036543aa" (UID: "44849697-9b41-4439-b8c7-f497036543aa"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.991546 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-scripts" (OuterVolumeSpecName: "scripts") pod "44849697-9b41-4439-b8c7-f497036543aa" (UID: "44849697-9b41-4439-b8c7-f497036543aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:05 crc kubenswrapper[4937]: I0225 16:13:05.999409 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgmzj\" (UniqueName: \"kubernetes.io/projected/da32b76e-0420-4e12-8b28-8a865a41d899-kube-api-access-zgmzj\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.002116 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-config-data-custom\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.002277 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44849697-9b41-4439-b8c7-f497036543aa-kube-api-access-tn6jr" (OuterVolumeSpecName: "kube-api-access-tn6jr") pod "44849697-9b41-4439-b8c7-f497036543aa" (UID: "44849697-9b41-4439-b8c7-f497036543aa"). InnerVolumeSpecName "kube-api-access-tn6jr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.005325 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-config-data\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.006006 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-scripts\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.008244 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " pod="openstack/cinder-api-0" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.067827 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-config-data" (OuterVolumeSpecName: "config-data") pod "44849697-9b41-4439-b8c7-f497036543aa" (UID: "44849697-9b41-4439-b8c7-f497036543aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.071064 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b99d98bc-2r54q" event={"ID":"394cbe6e-1697-449d-abaf-68e9ba275096","Type":"ContainerStarted","Data":"1e8f7bbbdef83385b74ed49b4d59e92081fccd43fbbcdaed30f427dd00f8b3e3"} Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.071111 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b99d98bc-2r54q" event={"ID":"394cbe6e-1697-449d-abaf-68e9ba275096","Type":"ContainerStarted","Data":"fecf6cc73a0b104eba709f44433c0606dc0ffb88bca851c92d3b3af40ae22062"} Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.072562 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44849697-9b41-4439-b8c7-f497036543aa" (UID: "44849697-9b41-4439-b8c7-f497036543aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.076641 4937 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/44849697-9b41-4439-b8c7-f497036543aa-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.076668 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.076681 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.076693 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44849697-9b41-4439-b8c7-f497036543aa-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.076706 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn6jr\" (UniqueName: \"kubernetes.io/projected/44849697-9b41-4439-b8c7-f497036543aa-kube-api-access-tn6jr\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.085923 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" event={"ID":"32f1341d-07b8-4522-8541-c01f9e9ce74a","Type":"ContainerStarted","Data":"8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224"} Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.086684 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.089192 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" event={"ID":"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5","Type":"ContainerStarted","Data":"04095f7ef09eab1dbb8f0db70e30e87e25249752ce2c7b3c711e35e19f925782"} Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.089219 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" event={"ID":"6520b70f-9bf6-4b3c-ad1e-4f43da8daec5","Type":"ContainerStarted","Data":"5664eaeda559d1158f18fa19971d006d4cc9388db838e08f8412fd60aaa9e062"} Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.092903 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c757b5c5d-sqs2g" event={"ID":"11e5858d-ee0a-4f76-8863-25be5ef4df36","Type":"ContainerStarted","Data":"b9626d68de1604e9c774bcb18c196acf50433c5281752d5f87e7677133de542b"} Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.092933 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c757b5c5d-sqs2g" event={"ID":"11e5858d-ee0a-4f76-8863-25be5ef4df36","Type":"ContainerStarted","Data":"ac43150d6d8ee5c8cb68fc82a3a1e105efeefc8c0b937d507051796c1e093a1e"} Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.097471 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-b4zhr" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.097642 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-b4zhr" event={"ID":"44849697-9b41-4439-b8c7-f497036543aa","Type":"ContainerDied","Data":"43b2a7c2ab5c47d272d4eb451a698820cdbc1a492e4f98268bf7165d67dc0ad4"} Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.097671 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43b2a7c2ab5c47d272d4eb451a698820cdbc1a492e4f98268bf7165d67dc0ad4" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.116560 4937 generic.go:334] "Generic (PLEG): container finished" podID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerID="975bf9c7209b653fdf9d6a2d25684b654b89782f6a2efef799f7e3f9f4ae31b7" exitCode=0 Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.116943 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ef5e4a5-46f1-4f72-ab91-699865d33243","Type":"ContainerDied","Data":"975bf9c7209b653fdf9d6a2d25684b654b89782f6a2efef799f7e3f9f4ae31b7"} Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.121833 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6b99d98bc-2r54q" podStartSLOduration=3.087372243 podStartE2EDuration="5.121822245s" podCreationTimestamp="2026-02-25 16:13:01 +0000 UTC" firstStartedPulling="2026-02-25 16:13:02.42336344 +0000 UTC m=+1633.436755330" lastFinishedPulling="2026-02-25 16:13:04.457813452 +0000 UTC m=+1635.471205332" observedRunningTime="2026-02-25 16:13:06.11203721 +0000 UTC m=+1637.125429090" watchObservedRunningTime="2026-02-25 16:13:06.121822245 +0000 UTC m=+1637.135214135" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.176994 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6b9499bbcd-kr7kb" podStartSLOduration=3.421178385 podStartE2EDuration="5.176976007s" podCreationTimestamp="2026-02-25 16:13:01 +0000 UTC" firstStartedPulling="2026-02-25 16:13:02.722537464 +0000 UTC m=+1633.735929354" lastFinishedPulling="2026-02-25 16:13:04.478335086 +0000 UTC m=+1635.491726976" observedRunningTime="2026-02-25 16:13:06.165888759 +0000 UTC m=+1637.179280649" watchObservedRunningTime="2026-02-25 16:13:06.176976007 +0000 UTC m=+1637.190367897" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.187059 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.246244 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" podStartSLOduration=5.246224161 podStartE2EDuration="5.246224161s" podCreationTimestamp="2026-02-25 16:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:06.207889571 +0000 UTC m=+1637.221281461" watchObservedRunningTime="2026-02-25 16:13:06.246224161 +0000 UTC m=+1637.259616051" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.311238 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-589f6455c9-dwk7g"] Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.311508 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-589f6455c9-dwk7g" podUID="2a9fc39a-301c-48eb-8ae0-238271352711" containerName="neutron-api" containerID="cri-o://76089335ac4fa0fe8873cf19d70b49ebd7b8a0769aafc4dc46234b25f3fdc49e" gracePeriod=30 Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.313738 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-589f6455c9-dwk7g" podUID="2a9fc39a-301c-48eb-8ae0-238271352711" containerName="neutron-httpd" containerID="cri-o://fdaa307ea492e884bba54e6a8e0d99cfc57123b3e821d9bc28312c86a1e72f1a" gracePeriod=30 Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.337328 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.346841 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.446556 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57ff6d8577-ntrmb"] Feb 25 16:13:06 crc kubenswrapper[4937]: E0225 16:13:06.447235 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44849697-9b41-4439-b8c7-f497036543aa" containerName="cloudkitty-db-sync" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.447304 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="44849697-9b41-4439-b8c7-f497036543aa" containerName="cloudkitty-db-sync" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.447588 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="44849697-9b41-4439-b8c7-f497036543aa" containerName="cloudkitty-db-sync" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.448772 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: W0225 16:13:06.450097 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad34bc64_7581_4b37_8b13_b2fbdbb6e901.slice/crio-07ea9d6d98f1956ccdf39c60d023f82b464bd6b5182280c7ad25eebcae28a867 WatchSource:0}: Error finding container 07ea9d6d98f1956ccdf39c60d023f82b464bd6b5182280c7ad25eebcae28a867: Status 404 returned error can't find the container with id 07ea9d6d98f1956ccdf39c60d023f82b464bd6b5182280c7ad25eebcae28a867 Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.514884 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57ff6d8577-ntrmb"] Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.553605 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-qb2mq"] Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.555408 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.564547 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-qb2mq"] Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.565155 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.565338 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-5lx8b" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.565471 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.565647 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.566153 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.606020 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-combined-ca-bundle\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.606060 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4d12dac4-3aaf-41e5-aff8-68749f020d89-certs\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.606148 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-config-data\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.606176 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-internal-tls-certs\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.606195 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q44k\" (UniqueName: \"kubernetes.io/projected/351a0bd5-2cd4-4f52-af68-6d86a512add0-kube-api-access-5q44k\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.606249 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-combined-ca-bundle\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.606311 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-httpd-config\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.606350 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-public-tls-certs\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.606406 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-ovndb-tls-certs\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.606441 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qggwf\" (UniqueName: \"kubernetes.io/projected/4d12dac4-3aaf-41e5-aff8-68749f020d89-kube-api-access-qggwf\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.606472 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-scripts\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.606520 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-config\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.702496 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-5c9776ccc5-8ph6z"] Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.712710 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-httpd-config\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.712768 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-public-tls-certs\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.712816 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-ovndb-tls-certs\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.712849 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qggwf\" (UniqueName: \"kubernetes.io/projected/4d12dac4-3aaf-41e5-aff8-68749f020d89-kube-api-access-qggwf\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.712882 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-scripts\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.712915 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-config\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.712999 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-combined-ca-bundle\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.713019 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4d12dac4-3aaf-41e5-aff8-68749f020d89-certs\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.713068 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-config-data\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.713097 4937 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5q44k\" (UniqueName: \"kubernetes.io/projected/351a0bd5-2cd4-4f52-af68-6d86a512add0-kube-api-access-5q44k\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.713117 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-internal-tls-certs\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.713160 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-combined-ca-bundle\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.722752 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-scripts\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.731835 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-ovndb-tls-certs\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.735802 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-combined-ca-bundle\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.737775 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-config\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.738374 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-combined-ca-bundle\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.739516 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4d12dac4-3aaf-41e5-aff8-68749f020d89-certs\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.744182 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-httpd-config\") pod 
\"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.748616 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q44k\" (UniqueName: \"kubernetes.io/projected/351a0bd5-2cd4-4f52-af68-6d86a512add0-kube-api-access-5q44k\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.752302 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-internal-tls-certs\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.752869 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-config-data\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.753550 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/351a0bd5-2cd4-4f52-af68-6d86a512add0-public-tls-certs\") pod \"neutron-57ff6d8577-ntrmb\" (UID: \"351a0bd5-2cd4-4f52-af68-6d86a512add0\") " pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.771277 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qggwf\" (UniqueName: \"kubernetes.io/projected/4d12dac4-3aaf-41e5-aff8-68749f020d89-kube-api-access-qggwf\") pod \"cloudkitty-storageinit-qb2mq\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.797025 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:06 crc kubenswrapper[4937]: I0225 16:13:06.974458 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.153746 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.170790 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-754k4" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="registry-server" probeResult="failure" output=< Feb 25 16:13:07 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 16:13:07 crc kubenswrapper[4937]: > Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.246085 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" event={"ID":"2d9a76a7-a730-4436-956e-d43596599433","Type":"ContainerStarted","Data":"f2f871d73458e61edf424529f66db7d7660497899f89390b2407be4fd8bf1029"} Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.260355 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ad34bc64-7581-4b37-8b13-b2fbdbb6e901","Type":"ContainerStarted","Data":"07ea9d6d98f1956ccdf39c60d023f82b464bd6b5182280c7ad25eebcae28a867"} Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.273337 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c757b5c5d-sqs2g" event={"ID":"11e5858d-ee0a-4f76-8863-25be5ef4df36","Type":"ContainerStarted","Data":"b9239770cc206a238b28d7b875fe9abd9f08e505b72db47c1d45f76384b3494c"} Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.273440 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.273475 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.280707 4937 generic.go:334] "Generic (PLEG): container finished" podID="2a9fc39a-301c-48eb-8ae0-238271352711" containerID="fdaa307ea492e884bba54e6a8e0d99cfc57123b3e821d9bc28312c86a1e72f1a" exitCode=0 Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.281519 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589f6455c9-dwk7g" event={"ID":"2a9fc39a-301c-48eb-8ae0-238271352711","Type":"ContainerDied","Data":"fdaa307ea492e884bba54e6a8e0d99cfc57123b3e821d9bc28312c86a1e72f1a"} Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.283831 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" podUID="32f1341d-07b8-4522-8541-c01f9e9ce74a" containerName="dnsmasq-dns" containerID="cri-o://8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224" gracePeriod=10 Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.321575 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6c757b5c5d-sqs2g" podStartSLOduration=3.321551127 podStartE2EDuration="3.321551127s" podCreationTimestamp="2026-02-25 16:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:07.307091585 +0000 UTC m=+1638.320483485" watchObservedRunningTime="2026-02-25 16:13:07.321551127 +0000 UTC m=+1638.334943027" Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.594138 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-57ff6d8577-ntrmb"] Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.633099 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-qb2mq"] Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.720095 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.764077 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ef5e4a5-46f1-4f72-ab91-699865d33243-log-httpd\") pod \"8ef5e4a5-46f1-4f72-ab91-699865d33243\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.764120 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ef5e4a5-46f1-4f72-ab91-699865d33243-run-httpd\") pod \"8ef5e4a5-46f1-4f72-ab91-699865d33243\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.764140 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-scripts\") pod \"8ef5e4a5-46f1-4f72-ab91-699865d33243\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.764204 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgg8j\" (UniqueName: \"kubernetes.io/projected/8ef5e4a5-46f1-4f72-ab91-699865d33243-kube-api-access-hgg8j\") pod \"8ef5e4a5-46f1-4f72-ab91-699865d33243\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.764224 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-combined-ca-bundle\") pod \"8ef5e4a5-46f1-4f72-ab91-699865d33243\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.764357 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-sg-core-conf-yaml\") pod \"8ef5e4a5-46f1-4f72-ab91-699865d33243\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.764424 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-config-data\") pod \"8ef5e4a5-46f1-4f72-ab91-699865d33243\" (UID: \"8ef5e4a5-46f1-4f72-ab91-699865d33243\") " Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.773950 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef5e4a5-46f1-4f72-ab91-699865d33243-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8ef5e4a5-46f1-4f72-ab91-699865d33243" (UID: "8ef5e4a5-46f1-4f72-ab91-699865d33243"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.774607 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef5e4a5-46f1-4f72-ab91-699865d33243-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8ef5e4a5-46f1-4f72-ab91-699865d33243" (UID: "8ef5e4a5-46f1-4f72-ab91-699865d33243"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.775681 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-scripts" (OuterVolumeSpecName: "scripts") pod "8ef5e4a5-46f1-4f72-ab91-699865d33243" (UID: "8ef5e4a5-46f1-4f72-ab91-699865d33243"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.793249 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef5e4a5-46f1-4f72-ab91-699865d33243-kube-api-access-hgg8j" (OuterVolumeSpecName: "kube-api-access-hgg8j") pod "8ef5e4a5-46f1-4f72-ab91-699865d33243" (UID: "8ef5e4a5-46f1-4f72-ab91-699865d33243"). InnerVolumeSpecName "kube-api-access-hgg8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:07 crc kubenswrapper[4937]: E0225 16:13:07.806985 4937 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32f1341d_07b8_4522_8541_c01f9e9ce74a.slice/crio-8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d9a76a7_a730_4436_956e_d43596599433.slice/crio-conmon-784fd775475f4891f5b4fd847b9f1778442d353bae9b82a46c3287cd29e4b0f7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d9a76a7_a730_4436_956e_d43596599433.slice/crio-784fd775475f4891f5b4fd847b9f1778442d353bae9b82a46c3287cd29e4b0f7.scope\": RecentStats: unable to find data in memory cache]" Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.865906 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgg8j\" (UniqueName: \"kubernetes.io/projected/8ef5e4a5-46f1-4f72-ab91-699865d33243-kube-api-access-hgg8j\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.865946 4937 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ef5e4a5-46f1-4f72-ab91-699865d33243-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.865969 4937 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ef5e4a5-46f1-4f72-ab91-699865d33243-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.865978 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:07 crc kubenswrapper[4937]: W0225 16:13:07.906367 4937 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod351a0bd5_2cd4_4f52_af68_6d86a512add0.slice/crio-3522498ebc5ec4e9c9bedf0ffa3977e2b78032906c7bac885ed06d903ed26e6e WatchSource:0}: Error finding container 3522498ebc5ec4e9c9bedf0ffa3977e2b78032906c7bac885ed06d903ed26e6e: Status 404 returned error can't find the container with id 3522498ebc5ec4e9c9bedf0ffa3977e2b78032906c7bac885ed06d903ed26e6e Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.906597 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8ef5e4a5-46f1-4f72-ab91-699865d33243" (UID: "8ef5e4a5-46f1-4f72-ab91-699865d33243"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.916019 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 25 16:13:07 crc kubenswrapper[4937]: I0225 16:13:07.975182 4937 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.023913 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ef5e4a5-46f1-4f72-ab91-699865d33243" (UID: "8ef5e4a5-46f1-4f72-ab91-699865d33243"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.053328 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.055203 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-config-data" (OuterVolumeSpecName: "config-data") pod "8ef5e4a5-46f1-4f72-ab91-699865d33243" (UID: "8ef5e4a5-46f1-4f72-ab91-699865d33243"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.077507 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.077533 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef5e4a5-46f1-4f72-ab91-699865d33243-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.136588 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-589f6455c9-dwk7g" podUID="2a9fc39a-301c-48eb-8ae0-238271352711" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.178:9696/\": dial tcp 10.217.0.178:9696: connect: connection refused" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.178950 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-dns-swift-storage-0\") pod \"32f1341d-07b8-4522-8541-c01f9e9ce74a\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.179225 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-ovsdbserver-nb\") pod \"32f1341d-07b8-4522-8541-c01f9e9ce74a\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.180081 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-config\") pod \"32f1341d-07b8-4522-8541-c01f9e9ce74a\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.180132 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffr8d\" (UniqueName: \"kubernetes.io/projected/32f1341d-07b8-4522-8541-c01f9e9ce74a-kube-api-access-ffr8d\") pod \"32f1341d-07b8-4522-8541-c01f9e9ce74a\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.180170 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-dns-svc\") pod \"32f1341d-07b8-4522-8541-c01f9e9ce74a\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.180315 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-ovsdbserver-sb\") pod \"32f1341d-07b8-4522-8541-c01f9e9ce74a\" (UID: \"32f1341d-07b8-4522-8541-c01f9e9ce74a\") " Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.204937 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32f1341d-07b8-4522-8541-c01f9e9ce74a-kube-api-access-ffr8d" (OuterVolumeSpecName: "kube-api-access-ffr8d") pod "32f1341d-07b8-4522-8541-c01f9e9ce74a" (UID: "32f1341d-07b8-4522-8541-c01f9e9ce74a"). InnerVolumeSpecName "kube-api-access-ffr8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.288281 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffr8d\" (UniqueName: \"kubernetes.io/projected/32f1341d-07b8-4522-8541-c01f9e9ce74a-kube-api-access-ffr8d\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.350728 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"da32b76e-0420-4e12-8b28-8a865a41d899","Type":"ContainerStarted","Data":"265d2d2be741a72e22cd5d8c6b3fbac8b06cd3541eca37d798ce7a1350aacb3f"} Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.365771 4937 generic.go:334] "Generic (PLEG): container finished" podID="32f1341d-07b8-4522-8541-c01f9e9ce74a" containerID="8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224" exitCode=0 Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.365911 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.365975 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" event={"ID":"32f1341d-07b8-4522-8541-c01f9e9ce74a","Type":"ContainerDied","Data":"8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224"} Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.366015 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vdxzb" event={"ID":"32f1341d-07b8-4522-8541-c01f9e9ce74a","Type":"ContainerDied","Data":"0c2e732b702e521f8c983d9001fda18bc4b5480e98d67a645a4e4986b705a835"} Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.366035 4937 scope.go:117] "RemoveContainer" containerID="8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.378839 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ef5e4a5-46f1-4f72-ab91-699865d33243","Type":"ContainerDied","Data":"c4beda9d4e738b69c4df08c3b7dd682f48d63081659c82a5b48be8d24813114f"} Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.378963 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.380834 4937 generic.go:334] "Generic (PLEG): container finished" podID="2d9a76a7-a730-4436-956e-d43596599433" containerID="784fd775475f4891f5b4fd847b9f1778442d353bae9b82a46c3287cd29e4b0f7" exitCode=0 Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.380887 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" event={"ID":"2d9a76a7-a730-4436-956e-d43596599433","Type":"ContainerDied","Data":"784fd775475f4891f5b4fd847b9f1778442d353bae9b82a46c3287cd29e4b0f7"} Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.382721 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57ff6d8577-ntrmb" event={"ID":"351a0bd5-2cd4-4f52-af68-6d86a512add0","Type":"ContainerStarted","Data":"3522498ebc5ec4e9c9bedf0ffa3977e2b78032906c7bac885ed06d903ed26e6e"} Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.394229 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-qb2mq" event={"ID":"4d12dac4-3aaf-41e5-aff8-68749f020d89","Type":"ContainerStarted","Data":"75bfb2f4a2807303c98be1f6bc72c4ebe2da6637e63184de7a194045ef99a2aa"} Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.422046 4937 scope.go:117] "RemoveContainer" containerID="eaf7f5567bab5c5686a976e8d10b599dc3fa8260408f62be89c0e93ba224c394" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.444589 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-config" (OuterVolumeSpecName: "config") pod "32f1341d-07b8-4522-8541-c01f9e9ce74a" (UID: "32f1341d-07b8-4522-8541-c01f9e9ce74a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.470537 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "32f1341d-07b8-4522-8541-c01f9e9ce74a" (UID: "32f1341d-07b8-4522-8541-c01f9e9ce74a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.494416 4937 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.494443 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.508177 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.522054 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.530517 4937 scope.go:117] "RemoveContainer" containerID="8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224" Feb 25 16:13:08 crc kubenswrapper[4937]: E0225 16:13:08.531671 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224\": container with ID starting with 8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224 not found: ID does not exist" containerID="8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.531706 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224"} err="failed to get container status \"8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224\": rpc error: code = NotFound desc = could not find container \"8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224\": container with ID starting with 8bfe0409ad13c4fd4659fb5a7d3b54066bbc67991d00d4a2be9d72bb1372a224 not found: ID does not exist" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.531734 4937 scope.go:117] "RemoveContainer" containerID="eaf7f5567bab5c5686a976e8d10b599dc3fa8260408f62be89c0e93ba224c394" Feb 25 16:13:08 crc kubenswrapper[4937]: E0225 16:13:08.532630 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaf7f5567bab5c5686a976e8d10b599dc3fa8260408f62be89c0e93ba224c394\": container with ID starting with eaf7f5567bab5c5686a976e8d10b599dc3fa8260408f62be89c0e93ba224c394 not found: ID does not exist" containerID="eaf7f5567bab5c5686a976e8d10b599dc3fa8260408f62be89c0e93ba224c394" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.532674 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaf7f5567bab5c5686a976e8d10b599dc3fa8260408f62be89c0e93ba224c394"} err="failed to get container status \"eaf7f5567bab5c5686a976e8d10b599dc3fa8260408f62be89c0e93ba224c394\": rpc error: code = NotFound desc = could not find container \"eaf7f5567bab5c5686a976e8d10b599dc3fa8260408f62be89c0e93ba224c394\": container with ID starting with eaf7f5567bab5c5686a976e8d10b599dc3fa8260408f62be89c0e93ba224c394 not found: ID does not exist" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.532704 4937 scope.go:117] "RemoveContainer" containerID="8b4a77614d7a0cf6d8e9ddcef449aae52cf40dd692c6145777ac09c87acce818" Feb 25 16:13:08 crc 
kubenswrapper[4937]: I0225 16:13:08.537819 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:08 crc kubenswrapper[4937]: E0225 16:13:08.538304 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerName="ceilometer-notification-agent" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.538327 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerName="ceilometer-notification-agent" Feb 25 16:13:08 crc kubenswrapper[4937]: E0225 16:13:08.538349 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1341d-07b8-4522-8541-c01f9e9ce74a" containerName="dnsmasq-dns" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.538358 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1341d-07b8-4522-8541-c01f9e9ce74a" containerName="dnsmasq-dns" Feb 25 16:13:08 crc kubenswrapper[4937]: E0225 16:13:08.538377 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1341d-07b8-4522-8541-c01f9e9ce74a" containerName="init" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.538387 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1341d-07b8-4522-8541-c01f9e9ce74a" containerName="init" Feb 25 16:13:08 crc kubenswrapper[4937]: E0225 16:13:08.538414 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerName="sg-core" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.538422 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerName="sg-core" Feb 25 16:13:08 crc kubenswrapper[4937]: E0225 16:13:08.538433 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerName="proxy-httpd" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.538442 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerName="proxy-httpd" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.538682 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerName="sg-core" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.538705 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerName="ceilometer-notification-agent" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.538717 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f1341d-07b8-4522-8541-c01f9e9ce74a" containerName="dnsmasq-dns" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.538731 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" containerName="proxy-httpd" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.541019 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.547381 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.547625 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.553697 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.587719 4937 scope.go:117] "RemoveContainer" containerID="be8325fa6e888793158a1f475725134ab900615886cb7f8cc349d8d9d4c67032" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.597462 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rblq\" (UniqueName: \"kubernetes.io/projected/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-kube-api-access-6rblq\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.603977 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-config-data\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.604063 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-run-httpd\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.604101 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-scripts\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.604351 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.604555 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-log-httpd\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.604648 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.633233 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-dns-svc" (OuterVolumeSpecName: 
"dns-svc") pod "32f1341d-07b8-4522-8541-c01f9e9ce74a" (UID: "32f1341d-07b8-4522-8541-c01f9e9ce74a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.692061 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "32f1341d-07b8-4522-8541-c01f9e9ce74a" (UID: "32f1341d-07b8-4522-8541-c01f9e9ce74a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.698858 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32f1341d-07b8-4522-8541-c01f9e9ce74a" (UID: "32f1341d-07b8-4522-8541-c01f9e9ce74a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.706539 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-log-httpd\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.706599 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.706622 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rblq\" (UniqueName: \"kubernetes.io/projected/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-kube-api-access-6rblq\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.706695 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-config-data\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.706717 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-run-httpd\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.706734 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-scripts\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.706810 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 
16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.706906 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.706922 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.706933 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32f1341d-07b8-4522-8541-c01f9e9ce74a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.708019 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-log-httpd\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.709128 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-run-httpd\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.716031 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.727777 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.728052 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-scripts\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.728227 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-config-data\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.728682 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rblq\" (UniqueName: \"kubernetes.io/projected/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-kube-api-access-6rblq\") pod \"ceilometer-0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " pod="openstack/ceilometer-0" Feb 25 16:13:08 crc kubenswrapper[4937]: I0225 16:13:08.879794 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:13:09 crc kubenswrapper[4937]: I0225 16:13:09.021672 4937 scope.go:117] "RemoveContainer" containerID="975bf9c7209b653fdf9d6a2d25684b654b89782f6a2efef799f7e3f9f4ae31b7" Feb 25 16:13:09 crc kubenswrapper[4937]: I0225 16:13:09.116227 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vdxzb"] Feb 25 16:13:09 crc kubenswrapper[4937]: I0225 16:13:09.146156 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vdxzb"] Feb 25 16:13:09 crc kubenswrapper[4937]: I0225 16:13:09.397284 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32f1341d-07b8-4522-8541-c01f9e9ce74a" path="/var/lib/kubelet/pods/32f1341d-07b8-4522-8541-c01f9e9ce74a/volumes" Feb 25 16:13:09 crc kubenswrapper[4937]: I0225 16:13:09.406963 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef5e4a5-46f1-4f72-ab91-699865d33243" path="/var/lib/kubelet/pods/8ef5e4a5-46f1-4f72-ab91-699865d33243/volumes" Feb 25 16:13:09 crc kubenswrapper[4937]: I0225 16:13:09.428849 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57ff6d8577-ntrmb" event={"ID":"351a0bd5-2cd4-4f52-af68-6d86a512add0","Type":"ContainerStarted","Data":"84c65e5e0d15e3ffaf838ec0c4dc92f99e9ba4fec09df3ffc6e328763ae9abe9"} Feb 25 16:13:09 crc kubenswrapper[4937]: I0225 16:13:09.428895 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57ff6d8577-ntrmb" event={"ID":"351a0bd5-2cd4-4f52-af68-6d86a512add0","Type":"ContainerStarted","Data":"f5e9b485b7059298b86370a61e0873a566aa450beb27ed81dd2ee889032d23f7"} Feb 25 16:13:09 crc kubenswrapper[4937]: I0225 16:13:09.430006 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:09 crc kubenswrapper[4937]: I0225 16:13:09.442948 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-qb2mq" event={"ID":"4d12dac4-3aaf-41e5-aff8-68749f020d89","Type":"ContainerStarted","Data":"83fa1dba06b54c83091f7f6bfeac3df28e8fb5313d3de3b14bfb9ee33cfda3d4"} Feb 25 16:13:09 crc kubenswrapper[4937]: I0225 16:13:09.445392 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"da32b76e-0420-4e12-8b28-8a865a41d899","Type":"ContainerStarted","Data":"1ec406ed044c94889f7711069205d579fbf677ca16fffde51f2ebc94f444aaf1"} Feb 25 16:13:09 crc kubenswrapper[4937]: I0225 16:13:09.462857 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57ff6d8577-ntrmb" podStartSLOduration=3.462835566 podStartE2EDuration="3.462835566s" podCreationTimestamp="2026-02-25 16:13:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:09.455648516 +0000 UTC m=+1640.469040406" watchObservedRunningTime="2026-02-25 16:13:09.462835566 +0000 UTC m=+1640.476227456" Feb 25 16:13:09 crc kubenswrapper[4937]: I0225 16:13:09.481923 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-qb2mq" podStartSLOduration=3.481901063 podStartE2EDuration="3.481901063s" podCreationTimestamp="2026-02-25 16:13:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:09.481075023 +0000 UTC m=+1640.494466913" 
watchObservedRunningTime="2026-02-25 16:13:09.481901063 +0000 UTC m=+1640.495292953" Feb 25 16:13:09 crc kubenswrapper[4937]: I0225 16:13:09.652153 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:09 crc kubenswrapper[4937]: W0225 16:13:09.668438 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a2d638a_ffd8_4721_be54_eb6b911ffbf0.slice/crio-a13a4896360173233a547fd6c2c57b3891482c6cfb7e6a580e77b7285b50aa5e WatchSource:0}: Error finding container a13a4896360173233a547fd6c2c57b3891482c6cfb7e6a580e77b7285b50aa5e: Status 404 returned error can't find the container with id a13a4896360173233a547fd6c2c57b3891482c6cfb7e6a580e77b7285b50aa5e Feb 25 16:13:10 crc kubenswrapper[4937]: I0225 16:13:10.459965 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"da32b76e-0420-4e12-8b28-8a865a41d899","Type":"ContainerStarted","Data":"4ecadf33cfb0137001239dc2fa3d24886888be08c239db6e6a840a35347849c1"} Feb 25 16:13:10 crc kubenswrapper[4937]: I0225 16:13:10.460594 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 25 16:13:10 crc kubenswrapper[4937]: I0225 16:13:10.460096 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="da32b76e-0420-4e12-8b28-8a865a41d899" containerName="cinder-api" containerID="cri-o://4ecadf33cfb0137001239dc2fa3d24886888be08c239db6e6a840a35347849c1" gracePeriod=30 Feb 25 16:13:10 crc kubenswrapper[4937]: I0225 16:13:10.460044 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="da32b76e-0420-4e12-8b28-8a865a41d899" containerName="cinder-api-log" containerID="cri-o://1ec406ed044c94889f7711069205d579fbf677ca16fffde51f2ebc94f444aaf1" gracePeriod=30 Feb 25 16:13:10 crc kubenswrapper[4937]: I0225 16:13:10.465199 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" event={"ID":"2d9a76a7-a730-4436-956e-d43596599433","Type":"ContainerStarted","Data":"dac55af911733a13003687c9df498f486d492219add91cedc53aaadb23c0ef74"} Feb 25 16:13:10 crc kubenswrapper[4937]: I0225 16:13:10.465334 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:10 crc kubenswrapper[4937]: I0225 16:13:10.467548 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ad34bc64-7581-4b37-8b13-b2fbdbb6e901","Type":"ContainerStarted","Data":"b1d82a12623c050ad7171bce388331a3044d60a635fda8eab8be7f40208d6175"} Feb 25 16:13:10 crc kubenswrapper[4937]: I0225 16:13:10.469292 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a2d638a-ffd8-4721-be54-eb6b911ffbf0","Type":"ContainerStarted","Data":"a13a4896360173233a547fd6c2c57b3891482c6cfb7e6a580e77b7285b50aa5e"} Feb 25 16:13:10 crc kubenswrapper[4937]: I0225 16:13:10.496592 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.4964744979999995 podStartE2EDuration="5.496474498s" podCreationTimestamp="2026-02-25 16:13:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:10.481458852 +0000 UTC m=+1641.494850742" watchObservedRunningTime="2026-02-25 16:13:10.496474498 
+0000 UTC m=+1641.509866398" Feb 25 16:13:10 crc kubenswrapper[4937]: I0225 16:13:10.521456 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" podStartSLOduration=5.521429773 podStartE2EDuration="5.521429773s" podCreationTimestamp="2026-02-25 16:13:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:10.510239313 +0000 UTC m=+1641.523631223" watchObservedRunningTime="2026-02-25 16:13:10.521429773 +0000 UTC m=+1641.534821683" Feb 25 16:13:11 crc kubenswrapper[4937]: I0225 16:13:11.500822 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:13:11 crc kubenswrapper[4937]: I0225 16:13:11.501237 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:13:11 crc kubenswrapper[4937]: I0225 16:13:11.526605 4937 generic.go:334] "Generic (PLEG): container finished" podID="da32b76e-0420-4e12-8b28-8a865a41d899" containerID="1ec406ed044c94889f7711069205d579fbf677ca16fffde51f2ebc94f444aaf1" exitCode=143 Feb 25 16:13:11 crc kubenswrapper[4937]: I0225 16:13:11.526919 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"da32b76e-0420-4e12-8b28-8a865a41d899","Type":"ContainerDied","Data":"1ec406ed044c94889f7711069205d579fbf677ca16fffde51f2ebc94f444aaf1"} Feb 25 16:13:11 crc kubenswrapper[4937]: I0225 16:13:11.529829 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ad34bc64-7581-4b37-8b13-b2fbdbb6e901","Type":"ContainerStarted","Data":"9bb2f38664972759ef6d2f7a0dd9f198d87ecb83a270a169060d72002b2d0835"} Feb 25 16:13:11 crc kubenswrapper[4937]: I0225 16:13:11.546013 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a2d638a-ffd8-4721-be54-eb6b911ffbf0","Type":"ContainerStarted","Data":"be08f64bb9075dd47090696dff9b7393cb09a71c381a4fb20ffd8f490a4757b4"} Feb 25 16:13:11 crc kubenswrapper[4937]: I0225 16:13:11.556978 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.928539321 podStartE2EDuration="6.556950782s" podCreationTimestamp="2026-02-25 16:13:05 +0000 UTC" firstStartedPulling="2026-02-25 16:13:06.495643489 +0000 UTC m=+1637.509035379" lastFinishedPulling="2026-02-25 16:13:08.12405495 +0000 UTC m=+1639.137446840" observedRunningTime="2026-02-25 16:13:11.553130467 +0000 UTC m=+1642.566522357" watchObservedRunningTime="2026-02-25 16:13:11.556950782 +0000 UTC m=+1642.570342682" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.300733 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.425127 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-config\") pod \"2a9fc39a-301c-48eb-8ae0-238271352711\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.425205 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-combined-ca-bundle\") pod \"2a9fc39a-301c-48eb-8ae0-238271352711\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.425245 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-internal-tls-certs\") pod \"2a9fc39a-301c-48eb-8ae0-238271352711\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.425333 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxwhn\" (UniqueName: \"kubernetes.io/projected/2a9fc39a-301c-48eb-8ae0-238271352711-kube-api-access-gxwhn\") pod \"2a9fc39a-301c-48eb-8ae0-238271352711\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.425406 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-httpd-config\") pod \"2a9fc39a-301c-48eb-8ae0-238271352711\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.425535 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-ovndb-tls-certs\") pod \"2a9fc39a-301c-48eb-8ae0-238271352711\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.425572 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-public-tls-certs\") pod \"2a9fc39a-301c-48eb-8ae0-238271352711\" (UID: \"2a9fc39a-301c-48eb-8ae0-238271352711\") " Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.460766 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a9fc39a-301c-48eb-8ae0-238271352711-kube-api-access-gxwhn" (OuterVolumeSpecName: "kube-api-access-gxwhn") pod "2a9fc39a-301c-48eb-8ae0-238271352711" (UID: "2a9fc39a-301c-48eb-8ae0-238271352711"). InnerVolumeSpecName "kube-api-access-gxwhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.483882 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2a9fc39a-301c-48eb-8ae0-238271352711" (UID: "2a9fc39a-301c-48eb-8ae0-238271352711"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.530656 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxwhn\" (UniqueName: \"kubernetes.io/projected/2a9fc39a-301c-48eb-8ae0-238271352711-kube-api-access-gxwhn\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.531833 4937 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.592505 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a2d638a-ffd8-4721-be54-eb6b911ffbf0","Type":"ContainerStarted","Data":"dc287f1e28ef0946a0a899f78118307dbe945bcabd976b94f3ab313d8b43fd20"} Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.604284 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a9fc39a-301c-48eb-8ae0-238271352711" (UID: "2a9fc39a-301c-48eb-8ae0-238271352711"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.630866 4937 generic.go:334] "Generic (PLEG): container finished" podID="2a9fc39a-301c-48eb-8ae0-238271352711" containerID="76089335ac4fa0fe8873cf19d70b49ebd7b8a0769aafc4dc46234b25f3fdc49e" exitCode=0 Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.631423 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-589f6455c9-dwk7g" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.631679 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589f6455c9-dwk7g" event={"ID":"2a9fc39a-301c-48eb-8ae0-238271352711","Type":"ContainerDied","Data":"76089335ac4fa0fe8873cf19d70b49ebd7b8a0769aafc4dc46234b25f3fdc49e"} Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.631714 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589f6455c9-dwk7g" event={"ID":"2a9fc39a-301c-48eb-8ae0-238271352711","Type":"ContainerDied","Data":"b649795795191f62ac09faac8c80c69067a83631ed639e3ea2f9db1a79474377"} Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.631752 4937 scope.go:117] "RemoveContainer" containerID="fdaa307ea492e884bba54e6a8e0d99cfc57123b3e821d9bc28312c86a1e72f1a" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.634900 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.671680 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2a9fc39a-301c-48eb-8ae0-238271352711" (UID: "2a9fc39a-301c-48eb-8ae0-238271352711"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.672076 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-config" (OuterVolumeSpecName: "config") pod "2a9fc39a-301c-48eb-8ae0-238271352711" (UID: "2a9fc39a-301c-48eb-8ae0-238271352711"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.674600 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2a9fc39a-301c-48eb-8ae0-238271352711" (UID: "2a9fc39a-301c-48eb-8ae0-238271352711"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.687600 4937 scope.go:117] "RemoveContainer" containerID="76089335ac4fa0fe8873cf19d70b49ebd7b8a0769aafc4dc46234b25f3fdc49e" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.706624 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2a9fc39a-301c-48eb-8ae0-238271352711" (UID: "2a9fc39a-301c-48eb-8ae0-238271352711"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.726121 4937 scope.go:117] "RemoveContainer" containerID="fdaa307ea492e884bba54e6a8e0d99cfc57123b3e821d9bc28312c86a1e72f1a" Feb 25 16:13:12 crc kubenswrapper[4937]: E0225 16:13:12.734899 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdaa307ea492e884bba54e6a8e0d99cfc57123b3e821d9bc28312c86a1e72f1a\": container with ID starting with fdaa307ea492e884bba54e6a8e0d99cfc57123b3e821d9bc28312c86a1e72f1a not found: ID does not exist" containerID="fdaa307ea492e884bba54e6a8e0d99cfc57123b3e821d9bc28312c86a1e72f1a" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.734953 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdaa307ea492e884bba54e6a8e0d99cfc57123b3e821d9bc28312c86a1e72f1a"} err="failed to get container status \"fdaa307ea492e884bba54e6a8e0d99cfc57123b3e821d9bc28312c86a1e72f1a\": rpc error: code = NotFound desc = could not find container \"fdaa307ea492e884bba54e6a8e0d99cfc57123b3e821d9bc28312c86a1e72f1a\": container with ID starting with fdaa307ea492e884bba54e6a8e0d99cfc57123b3e821d9bc28312c86a1e72f1a not found: ID does not exist" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.734985 4937 scope.go:117] "RemoveContainer" containerID="76089335ac4fa0fe8873cf19d70b49ebd7b8a0769aafc4dc46234b25f3fdc49e" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.736597 4937 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.736621 4937 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.736630 4937 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.736639 4937 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a9fc39a-301c-48eb-8ae0-238271352711-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:12 crc kubenswrapper[4937]: E0225 16:13:12.739876 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76089335ac4fa0fe8873cf19d70b49ebd7b8a0769aafc4dc46234b25f3fdc49e\": container with ID starting with 76089335ac4fa0fe8873cf19d70b49ebd7b8a0769aafc4dc46234b25f3fdc49e not found: ID does not exist" containerID="76089335ac4fa0fe8873cf19d70b49ebd7b8a0769aafc4dc46234b25f3fdc49e" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.739917 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76089335ac4fa0fe8873cf19d70b49ebd7b8a0769aafc4dc46234b25f3fdc49e"} err="failed to get container status \"76089335ac4fa0fe8873cf19d70b49ebd7b8a0769aafc4dc46234b25f3fdc49e\": rpc error: code = NotFound desc = could not find container \"76089335ac4fa0fe8873cf19d70b49ebd7b8a0769aafc4dc46234b25f3fdc49e\": container with ID starting with 76089335ac4fa0fe8873cf19d70b49ebd7b8a0769aafc4dc46234b25f3fdc49e not found: ID does not exist" Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.975529 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-589f6455c9-dwk7g"] Feb 25 16:13:12 crc kubenswrapper[4937]: I0225 16:13:12.985193 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-589f6455c9-dwk7g"] Feb 25 16:13:13 crc kubenswrapper[4937]: I0225 16:13:13.392501 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a9fc39a-301c-48eb-8ae0-238271352711" path="/var/lib/kubelet/pods/2a9fc39a-301c-48eb-8ae0-238271352711/volumes" Feb 25 16:13:13 crc kubenswrapper[4937]: I0225 16:13:13.434024 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-69bf8b4fb-nfx4x" podUID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 16:13:13 crc kubenswrapper[4937]: I0225 16:13:13.643496 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a2d638a-ffd8-4721-be54-eb6b911ffbf0","Type":"ContainerStarted","Data":"b755297cd1c0f0146497c9710ee4daaba508f5c278f65f587dba7b3a79f8473e"} Feb 25 16:13:14 crc kubenswrapper[4937]: I0225 16:13:14.086147 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:14 crc kubenswrapper[4937]: I0225 16:13:14.227159 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:14 crc kubenswrapper[4937]: I0225 16:13:14.652990 4937 generic.go:334] "Generic (PLEG): container finished" podID="4d12dac4-3aaf-41e5-aff8-68749f020d89" containerID="83fa1dba06b54c83091f7f6bfeac3df28e8fb5313d3de3b14bfb9ee33cfda3d4" exitCode=0 Feb 25 16:13:14 crc kubenswrapper[4937]: I0225 16:13:14.653103 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-qb2mq" 
event={"ID":"4d12dac4-3aaf-41e5-aff8-68749f020d89","Type":"ContainerDied","Data":"83fa1dba06b54c83091f7f6bfeac3df28e8fb5313d3de3b14bfb9ee33cfda3d4"} Feb 25 16:13:15 crc kubenswrapper[4937]: I0225 16:13:15.667598 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a2d638a-ffd8-4721-be54-eb6b911ffbf0","Type":"ContainerStarted","Data":"fa1d5c93a79360e663af9ea5722c354b0efee7b8d01479c5d48d69f93bb4dc9b"} Feb 25 16:13:15 crc kubenswrapper[4937]: I0225 16:13:15.667897 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 25 16:13:15 crc kubenswrapper[4937]: I0225 16:13:15.667920 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 16:13:15 crc kubenswrapper[4937]: I0225 16:13:15.711836 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.177933879 podStartE2EDuration="7.71182198s" podCreationTimestamp="2026-02-25 16:13:08 +0000 UTC" firstStartedPulling="2026-02-25 16:13:09.672290283 +0000 UTC m=+1640.685682173" lastFinishedPulling="2026-02-25 16:13:15.206178384 +0000 UTC m=+1646.219570274" observedRunningTime="2026-02-25 16:13:15.70982551 +0000 UTC m=+1646.723217400" watchObservedRunningTime="2026-02-25 16:13:15.71182198 +0000 UTC m=+1646.725213870" Feb 25 16:13:15 crc kubenswrapper[4937]: I0225 16:13:15.837755 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:15 crc kubenswrapper[4937]: I0225 16:13:15.910044 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-cv6h6"] Feb 25 16:13:15 crc kubenswrapper[4937]: I0225 16:13:15.910286 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fb745b69-cv6h6" podUID="de392ea2-2165-462b-83dc-d21599f4888b" containerName="dnsmasq-dns" containerID="cri-o://81db60abb1ddaad73fa876838afd22b15acb3e798953036b822338fc96b2d90e" gracePeriod=10 Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.251901 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.342393 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.467250 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.611337 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-config-data\") pod \"4d12dac4-3aaf-41e5-aff8-68749f020d89\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.611394 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-scripts\") pod \"4d12dac4-3aaf-41e5-aff8-68749f020d89\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.611430 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-combined-ca-bundle\") pod \"4d12dac4-3aaf-41e5-aff8-68749f020d89\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.611565 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qggwf\" (UniqueName: \"kubernetes.io/projected/4d12dac4-3aaf-41e5-aff8-68749f020d89-kube-api-access-qggwf\") pod \"4d12dac4-3aaf-41e5-aff8-68749f020d89\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.611602 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4d12dac4-3aaf-41e5-aff8-68749f020d89-certs\") pod \"4d12dac4-3aaf-41e5-aff8-68749f020d89\" (UID: \"4d12dac4-3aaf-41e5-aff8-68749f020d89\") " Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.618110 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-scripts" (OuterVolumeSpecName: "scripts") pod "4d12dac4-3aaf-41e5-aff8-68749f020d89" (UID: "4d12dac4-3aaf-41e5-aff8-68749f020d89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.620703 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d12dac4-3aaf-41e5-aff8-68749f020d89-kube-api-access-qggwf" (OuterVolumeSpecName: "kube-api-access-qggwf") pod "4d12dac4-3aaf-41e5-aff8-68749f020d89" (UID: "4d12dac4-3aaf-41e5-aff8-68749f020d89"). InnerVolumeSpecName "kube-api-access-qggwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.623651 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d12dac4-3aaf-41e5-aff8-68749f020d89-certs" (OuterVolumeSpecName: "certs") pod "4d12dac4-3aaf-41e5-aff8-68749f020d89" (UID: "4d12dac4-3aaf-41e5-aff8-68749f020d89"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.663205 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d12dac4-3aaf-41e5-aff8-68749f020d89" (UID: "4d12dac4-3aaf-41e5-aff8-68749f020d89"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.663355 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-config-data" (OuterVolumeSpecName: "config-data") pod "4d12dac4-3aaf-41e5-aff8-68749f020d89" (UID: "4d12dac4-3aaf-41e5-aff8-68749f020d89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.691244 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-qb2mq" event={"ID":"4d12dac4-3aaf-41e5-aff8-68749f020d89","Type":"ContainerDied","Data":"75bfb2f4a2807303c98be1f6bc72c4ebe2da6637e63184de7a194045ef99a2aa"} Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.691298 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75bfb2f4a2807303c98be1f6bc72c4ebe2da6637e63184de7a194045ef99a2aa" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.691363 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-qb2mq" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.711734 4937 generic.go:334] "Generic (PLEG): container finished" podID="de392ea2-2165-462b-83dc-d21599f4888b" containerID="81db60abb1ddaad73fa876838afd22b15acb3e798953036b822338fc96b2d90e" exitCode=0 Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.713723 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-cv6h6" event={"ID":"de392ea2-2165-462b-83dc-d21599f4888b","Type":"ContainerDied","Data":"81db60abb1ddaad73fa876838afd22b15acb3e798953036b822338fc96b2d90e"} Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.713770 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-cv6h6" event={"ID":"de392ea2-2165-462b-83dc-d21599f4888b","Type":"ContainerDied","Data":"be945541a655da1c7a447ccd266c2e94a192275ad7aa6b57640fb9101787e3f6"} Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.713789 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be945541a655da1c7a447ccd266c2e94a192275ad7aa6b57640fb9101787e3f6" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.713918 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qggwf\" (UniqueName: \"kubernetes.io/projected/4d12dac4-3aaf-41e5-aff8-68749f020d89-kube-api-access-qggwf\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.713937 4937 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4d12dac4-3aaf-41e5-aff8-68749f020d89-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.713949 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.713960 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.713973 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4d12dac4-3aaf-41e5-aff8-68749f020d89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.793987 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.801551 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.971553 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-config\") pod \"de392ea2-2165-462b-83dc-d21599f4888b\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.971804 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-dns-svc\") pod \"de392ea2-2165-462b-83dc-d21599f4888b\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.971884 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-ovsdbserver-nb\") pod \"de392ea2-2165-462b-83dc-d21599f4888b\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.971935 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-ovsdbserver-sb\") pod \"de392ea2-2165-462b-83dc-d21599f4888b\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.972085 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9vsx\" (UniqueName: \"kubernetes.io/projected/de392ea2-2165-462b-83dc-d21599f4888b-kube-api-access-n9vsx\") pod \"de392ea2-2165-462b-83dc-d21599f4888b\" (UID: \"de392ea2-2165-462b-83dc-d21599f4888b\") " Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.976643 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:13:16 crc kubenswrapper[4937]: E0225 16:13:16.981287 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a9fc39a-301c-48eb-8ae0-238271352711" containerName="neutron-api" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.991974 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a9fc39a-301c-48eb-8ae0-238271352711" containerName="neutron-api" Feb 25 16:13:16 crc kubenswrapper[4937]: E0225 16:13:16.992051 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a9fc39a-301c-48eb-8ae0-238271352711" containerName="neutron-httpd" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.992060 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a9fc39a-301c-48eb-8ae0-238271352711" containerName="neutron-httpd" Feb 25 16:13:16 crc kubenswrapper[4937]: E0225 16:13:16.992104 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de392ea2-2165-462b-83dc-d21599f4888b" containerName="init" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.992110 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="de392ea2-2165-462b-83dc-d21599f4888b" containerName="init" Feb 25 16:13:16 crc 
kubenswrapper[4937]: E0225 16:13:16.992133 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d12dac4-3aaf-41e5-aff8-68749f020d89" containerName="cloudkitty-storageinit" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.992140 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d12dac4-3aaf-41e5-aff8-68749f020d89" containerName="cloudkitty-storageinit" Feb 25 16:13:16 crc kubenswrapper[4937]: E0225 16:13:16.992149 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de392ea2-2165-462b-83dc-d21599f4888b" containerName="dnsmasq-dns" Feb 25 16:13:16 crc kubenswrapper[4937]: I0225 16:13:16.992156 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="de392ea2-2165-462b-83dc-d21599f4888b" containerName="dnsmasq-dns" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:16.997035 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a9fc39a-301c-48eb-8ae0-238271352711" containerName="neutron-api" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:16.997070 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="de392ea2-2165-462b-83dc-d21599f4888b" containerName="dnsmasq-dns" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:16.997097 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d12dac4-3aaf-41e5-aff8-68749f020d89" containerName="cloudkitty-storageinit" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:16.997154 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a9fc39a-301c-48eb-8ae0-238271352711" containerName="neutron-httpd" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:16.998223 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.013317 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.013640 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.013789 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-5lx8b" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.013921 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.014049 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.016089 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de392ea2-2165-462b-83dc-d21599f4888b-kube-api-access-n9vsx" (OuterVolumeSpecName: "kube-api-access-n9vsx") pod "de392ea2-2165-462b-83dc-d21599f4888b" (UID: "de392ea2-2165-462b-83dc-d21599f4888b"). InnerVolumeSpecName "kube-api-access-n9vsx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.049513 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.060765 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-754k4" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="registry-server" probeResult="failure" output=< Feb 25 16:13:17 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 16:13:17 crc kubenswrapper[4937]: > Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.082406 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.082886 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-certs\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.083026 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvw8k\" (UniqueName: \"kubernetes.io/projected/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-kube-api-access-pvw8k\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.083085 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-scripts\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.083136 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-config-data\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.083154 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.083332 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9vsx\" (UniqueName: \"kubernetes.io/projected/de392ea2-2165-462b-83dc-d21599f4888b-kube-api-access-n9vsx\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.144728 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "de392ea2-2165-462b-83dc-d21599f4888b" (UID: "de392ea2-2165-462b-83dc-d21599f4888b"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.150326 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-xtnr5"] Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.157819 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c757b5c5d-sqs2g" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.159679 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.175856 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-xtnr5"] Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.179006 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "de392ea2-2165-462b-83dc-d21599f4888b" (UID: "de392ea2-2165-462b-83dc-d21599f4888b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.181163 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-config" (OuterVolumeSpecName: "config") pod "de392ea2-2165-462b-83dc-d21599f4888b" (UID: "de392ea2-2165-462b-83dc-d21599f4888b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.184695 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-config\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.184744 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-svc\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.184814 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.184847 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvhpp\" (UniqueName: \"kubernetes.io/projected/4cfc3af1-e6ed-4ac5-b539-822aecc38181-kube-api-access-pvhpp\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.184869 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvw8k\" (UniqueName: \"kubernetes.io/projected/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-kube-api-access-pvw8k\") pod 
\"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.184910 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-scripts\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.184936 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-config-data\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.184952 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.185016 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.185096 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.185119 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.185139 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-certs\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.185185 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.185195 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.185205 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:17 crc 
kubenswrapper[4937]: I0225 16:13:17.194350 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.196145 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.200897 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-scripts\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.211346 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.214937 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvw8k\" (UniqueName: \"kubernetes.io/projected/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-kube-api-access-pvw8k\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.219307 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-config-data\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.222059 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.222603 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-certs\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.233138 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.256098 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de392ea2-2165-462b-83dc-d21599f4888b" (UID: "de392ea2-2165-462b-83dc-d21599f4888b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.256557 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.287582 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.287660 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.287694 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-config\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.287718 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-svc\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.287744 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-config-data\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.287761 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ee8e767-6556-4a42-9ab2-e68a10380019-logs\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.287801 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.287836 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvhpp\" (UniqueName: \"kubernetes.io/projected/4cfc3af1-e6ed-4ac5-b539-822aecc38181-kube-api-access-pvhpp\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.287850 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-scripts\") pod 
\"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.287886 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9ee8e767-6556-4a42-9ab2-e68a10380019-certs\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.287965 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjt5d\" (UniqueName: \"kubernetes.io/projected/9ee8e767-6556-4a42-9ab2-e68a10380019-kube-api-access-fjt5d\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.288023 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.288072 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.288126 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de392ea2-2165-462b-83dc-d21599f4888b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.288861 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.289358 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.290103 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-config\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.290699 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-svc\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.291239 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.343539 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvhpp\" (UniqueName: \"kubernetes.io/projected/4cfc3af1-e6ed-4ac5-b539-822aecc38181-kube-api-access-pvhpp\") pod \"dnsmasq-dns-67bdc55879-xtnr5\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.380309 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69bf8b4fb-nfx4x"] Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.380583 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69bf8b4fb-nfx4x" podUID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerName="barbican-api-log" containerID="cri-o://17dcd3c571d10ecb5fbeba2d96fa4f01b7d503a85b38e859b46e71d755d087e5" gracePeriod=30 Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.381919 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69bf8b4fb-nfx4x" podUID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerName="barbican-api" containerID="cri-o://a453a521d7da1667622e4fa12753c941359a7de17fd65b93adc1a0a83ecd77e1" gracePeriod=30 Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.389704 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.389784 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-config-data\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.389811 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ee8e767-6556-4a42-9ab2-e68a10380019-logs\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.389856 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-scripts\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.389888 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9ee8e767-6556-4a42-9ab2-e68a10380019-certs\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.389937 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjt5d\" (UniqueName: 
\"kubernetes.io/projected/9ee8e767-6556-4a42-9ab2-e68a10380019-kube-api-access-fjt5d\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.389989 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.390873 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ee8e767-6556-4a42-9ab2-e68a10380019-logs\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.399038 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9ee8e767-6556-4a42-9ab2-e68a10380019-certs\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.408380 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.410356 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-config-data\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.413043 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-scripts\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.417691 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.422171 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-69bf8b4fb-nfx4x" podUID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.185:9311/healthcheck\": EOF" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.422723 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjt5d\" (UniqueName: \"kubernetes.io/projected/9ee8e767-6556-4a42-9ab2-e68a10380019-kube-api-access-fjt5d\") pod \"cloudkitty-api-0\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.466952 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.501842 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.560437 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.731470 4937 generic.go:334] "Generic (PLEG): container finished" podID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerID="17dcd3c571d10ecb5fbeba2d96fa4f01b7d503a85b38e859b46e71d755d087e5" exitCode=143 Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.731844 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-cv6h6" Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.732614 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69bf8b4fb-nfx4x" event={"ID":"8ba5074b-5bcc-4028-9fe9-7ea203f32b25","Type":"ContainerDied","Data":"17dcd3c571d10ecb5fbeba2d96fa4f01b7d503a85b38e859b46e71d755d087e5"} Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.732986 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ad34bc64-7581-4b37-8b13-b2fbdbb6e901" containerName="cinder-scheduler" containerID="cri-o://b1d82a12623c050ad7171bce388331a3044d60a635fda8eab8be7f40208d6175" gracePeriod=30 Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.733164 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ad34bc64-7581-4b37-8b13-b2fbdbb6e901" containerName="probe" containerID="cri-o://9bb2f38664972759ef6d2f7a0dd9f198d87ecb83a270a169060d72002b2d0835" gracePeriod=30 Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.787529 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-cv6h6"] Feb 25 16:13:17 crc kubenswrapper[4937]: I0225 16:13:17.800660 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-cv6h6"] Feb 25 16:13:18 crc kubenswrapper[4937]: I0225 16:13:18.591459 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-xtnr5"] Feb 25 16:13:18 crc kubenswrapper[4937]: I0225 16:13:18.759560 4937 generic.go:334] "Generic (PLEG): container finished" podID="ad34bc64-7581-4b37-8b13-b2fbdbb6e901" containerID="9bb2f38664972759ef6d2f7a0dd9f198d87ecb83a270a169060d72002b2d0835" exitCode=0 Feb 25 16:13:18 crc kubenswrapper[4937]: I0225 16:13:18.759622 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ad34bc64-7581-4b37-8b13-b2fbdbb6e901","Type":"ContainerDied","Data":"9bb2f38664972759ef6d2f7a0dd9f198d87ecb83a270a169060d72002b2d0835"} Feb 25 16:13:18 crc kubenswrapper[4937]: I0225 16:13:18.769639 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" event={"ID":"4cfc3af1-e6ed-4ac5-b539-822aecc38181","Type":"ContainerStarted","Data":"77ffc8fbb64ad75e0108e3b6f483e81c97a1703acb4d7bbdaa0d3557817e2628"} Feb 25 16:13:18 crc kubenswrapper[4937]: I0225 16:13:18.784346 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:13:18 crc kubenswrapper[4937]: I0225 16:13:18.799098 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:13:19 
crc kubenswrapper[4937]: I0225 16:13:19.386156 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de392ea2-2165-462b-83dc-d21599f4888b" path="/var/lib/kubelet/pods/de392ea2-2165-462b-83dc-d21599f4888b/volumes" Feb 25 16:13:19 crc kubenswrapper[4937]: I0225 16:13:19.808695 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"af6a9d4b-1995-4a50-bc72-83bbebe6c32b","Type":"ContainerStarted","Data":"3ab1a2c8e1fc185a2ac153e6674ea551b0b3f9b194c210bc7cc7f4dcd1f990c3"} Feb 25 16:13:19 crc kubenswrapper[4937]: I0225 16:13:19.825884 4937 generic.go:334] "Generic (PLEG): container finished" podID="ad34bc64-7581-4b37-8b13-b2fbdbb6e901" containerID="b1d82a12623c050ad7171bce388331a3044d60a635fda8eab8be7f40208d6175" exitCode=0 Feb 25 16:13:19 crc kubenswrapper[4937]: I0225 16:13:19.825953 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ad34bc64-7581-4b37-8b13-b2fbdbb6e901","Type":"ContainerDied","Data":"b1d82a12623c050ad7171bce388331a3044d60a635fda8eab8be7f40208d6175"} Feb 25 16:13:19 crc kubenswrapper[4937]: I0225 16:13:19.834049 4937 generic.go:334] "Generic (PLEG): container finished" podID="4cfc3af1-e6ed-4ac5-b539-822aecc38181" containerID="ed872c321f61826d18ac43d6421bd6fc0d3e442e201ba2de37e63f957b6528b6" exitCode=0 Feb 25 16:13:19 crc kubenswrapper[4937]: I0225 16:13:19.834259 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" event={"ID":"4cfc3af1-e6ed-4ac5-b539-822aecc38181","Type":"ContainerDied","Data":"ed872c321f61826d18ac43d6421bd6fc0d3e442e201ba2de37e63f957b6528b6"} Feb 25 16:13:19 crc kubenswrapper[4937]: I0225 16:13:19.860389 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"9ee8e767-6556-4a42-9ab2-e68a10380019","Type":"ContainerStarted","Data":"b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b"} Feb 25 16:13:19 crc kubenswrapper[4937]: I0225 16:13:19.860689 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"9ee8e767-6556-4a42-9ab2-e68a10380019","Type":"ContainerStarted","Data":"80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732"} Feb 25 16:13:19 crc kubenswrapper[4937]: I0225 16:13:19.860700 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"9ee8e767-6556-4a42-9ab2-e68a10380019","Type":"ContainerStarted","Data":"baf47360558e8938886217738f2d987aae09716814ece2ba3921802b88a445ba"} Feb 25 16:13:19 crc kubenswrapper[4937]: I0225 16:13:19.861555 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 25 16:13:19 crc kubenswrapper[4937]: I0225 16:13:19.921704 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.921686586 podStartE2EDuration="2.921686586s" podCreationTimestamp="2026-02-25 16:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:19.899467659 +0000 UTC m=+1650.912859549" watchObservedRunningTime="2026-02-25 16:13:19.921686586 +0000 UTC m=+1650.935078476" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.314096 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.380148 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx8z8\" (UniqueName: \"kubernetes.io/projected/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-kube-api-access-rx8z8\") pod \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.380201 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-scripts\") pod \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.380367 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-etc-machine-id\") pod \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.380542 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-config-data\") pod \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.380570 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-combined-ca-bundle\") pod \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.380603 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-config-data-custom\") pod \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\" (UID: \"ad34bc64-7581-4b37-8b13-b2fbdbb6e901\") " Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.390649 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ad34bc64-7581-4b37-8b13-b2fbdbb6e901" (UID: "ad34bc64-7581-4b37-8b13-b2fbdbb6e901"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.391588 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ad34bc64-7581-4b37-8b13-b2fbdbb6e901" (UID: "ad34bc64-7581-4b37-8b13-b2fbdbb6e901"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.403032 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-kube-api-access-rx8z8" (OuterVolumeSpecName: "kube-api-access-rx8z8") pod "ad34bc64-7581-4b37-8b13-b2fbdbb6e901" (UID: "ad34bc64-7581-4b37-8b13-b2fbdbb6e901"). InnerVolumeSpecName "kube-api-access-rx8z8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.407350 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-scripts" (OuterVolumeSpecName: "scripts") pod "ad34bc64-7581-4b37-8b13-b2fbdbb6e901" (UID: "ad34bc64-7581-4b37-8b13-b2fbdbb6e901"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.482949 4937 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.482978 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx8z8\" (UniqueName: \"kubernetes.io/projected/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-kube-api-access-rx8z8\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.482988 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.482996 4937 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.595416 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad34bc64-7581-4b37-8b13-b2fbdbb6e901" (UID: "ad34bc64-7581-4b37-8b13-b2fbdbb6e901"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.629724 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-config-data" (OuterVolumeSpecName: "config-data") pod "ad34bc64-7581-4b37-8b13-b2fbdbb6e901" (UID: "ad34bc64-7581-4b37-8b13-b2fbdbb6e901"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.635447 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.687261 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.687294 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad34bc64-7581-4b37-8b13-b2fbdbb6e901-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.730460 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.882834 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ad34bc64-7581-4b37-8b13-b2fbdbb6e901","Type":"ContainerDied","Data":"07ea9d6d98f1956ccdf39c60d023f82b464bd6b5182280c7ad25eebcae28a867"} Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.882868 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.882884 4937 scope.go:117] "RemoveContainer" containerID="9bb2f38664972759ef6d2f7a0dd9f198d87ecb83a270a169060d72002b2d0835" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.899758 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" event={"ID":"4cfc3af1-e6ed-4ac5-b539-822aecc38181","Type":"ContainerStarted","Data":"21e7f388a9ba2cddfa57fd9bd86c95641b59d5fc846737f44344255bcf1a8ac6"} Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.899792 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.928675 4937 scope.go:117] "RemoveContainer" containerID="b1d82a12623c050ad7171bce388331a3044d60a635fda8eab8be7f40208d6175" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.930524 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" podStartSLOduration=3.930507547 podStartE2EDuration="3.930507547s" podCreationTimestamp="2026-02-25 16:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:20.921890211 +0000 UTC m=+1651.935282121" watchObservedRunningTime="2026-02-25 16:13:20.930507547 +0000 UTC m=+1651.943899437" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.969094 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.991554 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.996344 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 16:13:20 crc kubenswrapper[4937]: E0225 16:13:20.996782 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad34bc64-7581-4b37-8b13-b2fbdbb6e901" containerName="cinder-scheduler" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.996798 4937 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ad34bc64-7581-4b37-8b13-b2fbdbb6e901" containerName="cinder-scheduler" Feb 25 16:13:20 crc kubenswrapper[4937]: E0225 16:13:20.996815 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad34bc64-7581-4b37-8b13-b2fbdbb6e901" containerName="probe" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.996823 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad34bc64-7581-4b37-8b13-b2fbdbb6e901" containerName="probe" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.997016 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad34bc64-7581-4b37-8b13-b2fbdbb6e901" containerName="cinder-scheduler" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.997045 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad34bc64-7581-4b37-8b13-b2fbdbb6e901" containerName="probe" Feb 25 16:13:20 crc kubenswrapper[4937]: I0225 16:13:20.998101 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.001516 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.004298 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.096307 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6b49871-d3a3-4846-9c5d-3df7b920a420-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.096356 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b49871-d3a3-4846-9c5d-3df7b920a420-config-data\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.096386 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6b49871-d3a3-4846-9c5d-3df7b920a420-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.096449 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnn4t\" (UniqueName: \"kubernetes.io/projected/f6b49871-d3a3-4846-9c5d-3df7b920a420-kube-api-access-lnn4t\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.096536 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6b49871-d3a3-4846-9c5d-3df7b920a420-scripts\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.096561 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f6b49871-d3a3-4846-9c5d-3df7b920a420-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.198644 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6b49871-d3a3-4846-9c5d-3df7b920a420-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.198701 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6b49871-d3a3-4846-9c5d-3df7b920a420-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.198720 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b49871-d3a3-4846-9c5d-3df7b920a420-config-data\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.198790 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnn4t\" (UniqueName: \"kubernetes.io/projected/f6b49871-d3a3-4846-9c5d-3df7b920a420-kube-api-access-lnn4t\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.198830 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6b49871-d3a3-4846-9c5d-3df7b920a420-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.198862 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6b49871-d3a3-4846-9c5d-3df7b920a420-scripts\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.198937 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b49871-d3a3-4846-9c5d-3df7b920a420-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.202335 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6b49871-d3a3-4846-9c5d-3df7b920a420-scripts\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.202708 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6b49871-d3a3-4846-9c5d-3df7b920a420-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.203943 4937 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b49871-d3a3-4846-9c5d-3df7b920a420-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.204072 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b49871-d3a3-4846-9c5d-3df7b920a420-config-data\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.215810 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnn4t\" (UniqueName: \"kubernetes.io/projected/f6b49871-d3a3-4846-9c5d-3df7b920a420-kube-api-access-lnn4t\") pod \"cinder-scheduler-0\" (UID: \"f6b49871-d3a3-4846-9c5d-3df7b920a420\") " pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.331902 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.332746 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.425053 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad34bc64-7581-4b37-8b13-b2fbdbb6e901" path="/var/lib/kubelet/pods/ad34bc64-7581-4b37-8b13-b2fbdbb6e901/volumes" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.426616 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.919853 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="9ee8e767-6556-4a42-9ab2-e68a10380019" containerName="cloudkitty-api-log" containerID="cri-o://80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732" gracePeriod=30 Feb 25 16:13:21 crc kubenswrapper[4937]: I0225 16:13:21.920278 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="9ee8e767-6556-4a42-9ab2-e68a10380019" containerName="cloudkitty-api" containerID="cri-o://b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b" gracePeriod=30 Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.322230 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7595479948-g6dtl" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.647589 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.685268 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.687107 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.689451 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.689454 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.690578 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-8d68v" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.739627 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.750007 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3105ae01-7d47-4714-90ae-d17157c01e91-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.750202 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5tfl\" (UniqueName: \"kubernetes.io/projected/3105ae01-7d47-4714-90ae-d17157c01e91-kube-api-access-d5tfl\") pod \"openstackclient\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.750257 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3105ae01-7d47-4714-90ae-d17157c01e91-openstack-config\") pod \"openstackclient\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.750408 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3105ae01-7d47-4714-90ae-d17157c01e91-openstack-config-secret\") pod \"openstackclient\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.851753 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.853398 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3105ae01-7d47-4714-90ae-d17157c01e91-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.853613 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5tfl\" (UniqueName: \"kubernetes.io/projected/3105ae01-7d47-4714-90ae-d17157c01e91-kube-api-access-d5tfl\") pod \"openstackclient\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.853994 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3105ae01-7d47-4714-90ae-d17157c01e91-openstack-config\") pod \"openstackclient\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.854796 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3105ae01-7d47-4714-90ae-d17157c01e91-openstack-config\") pod \"openstackclient\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.854989 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3105ae01-7d47-4714-90ae-d17157c01e91-openstack-config-secret\") pod \"openstackclient\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.858303 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3105ae01-7d47-4714-90ae-d17157c01e91-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.874058 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5tfl\" (UniqueName: \"kubernetes.io/projected/3105ae01-7d47-4714-90ae-d17157c01e91-kube-api-access-d5tfl\") pod \"openstackclient\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.885370 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3105ae01-7d47-4714-90ae-d17157c01e91-openstack-config-secret\") pod \"openstackclient\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.913922 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.914737 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.941304 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.955063 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 25 16:13:22 crc kubenswrapper[4937]: E0225 16:13:22.955599 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee8e767-6556-4a42-9ab2-e68a10380019" containerName="cloudkitty-api" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.955614 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee8e767-6556-4a42-9ab2-e68a10380019" containerName="cloudkitty-api" Feb 25 16:13:22 crc kubenswrapper[4937]: E0225 16:13:22.955633 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee8e767-6556-4a42-9ab2-e68a10380019" containerName="cloudkitty-api-log" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.955641 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee8e767-6556-4a42-9ab2-e68a10380019" containerName="cloudkitty-api-log" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.955850 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee8e767-6556-4a42-9ab2-e68a10380019" containerName="cloudkitty-api" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.955888 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee8e767-6556-4a42-9ab2-e68a10380019" containerName="cloudkitty-api-log" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.956855 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.958242 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjt5d\" (UniqueName: \"kubernetes.io/projected/9ee8e767-6556-4a42-9ab2-e68a10380019-kube-api-access-fjt5d\") pod \"9ee8e767-6556-4a42-9ab2-e68a10380019\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.958312 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-scripts\") pod \"9ee8e767-6556-4a42-9ab2-e68a10380019\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.958373 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9ee8e767-6556-4a42-9ab2-e68a10380019-certs\") pod \"9ee8e767-6556-4a42-9ab2-e68a10380019\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.958429 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-config-data-custom\") pod \"9ee8e767-6556-4a42-9ab2-e68a10380019\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.958584 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-config-data\") pod \"9ee8e767-6556-4a42-9ab2-e68a10380019\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.958689 4937 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-combined-ca-bundle\") pod \"9ee8e767-6556-4a42-9ab2-e68a10380019\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.958858 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ee8e767-6556-4a42-9ab2-e68a10380019-logs\") pod \"9ee8e767-6556-4a42-9ab2-e68a10380019\" (UID: \"9ee8e767-6556-4a42-9ab2-e68a10380019\") " Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.959823 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee8e767-6556-4a42-9ab2-e68a10380019-logs" (OuterVolumeSpecName: "logs") pod "9ee8e767-6556-4a42-9ab2-e68a10380019" (UID: "9ee8e767-6556-4a42-9ab2-e68a10380019"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.972010 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"af6a9d4b-1995-4a50-bc72-83bbebe6c32b","Type":"ContainerStarted","Data":"7d23e1fca2bc309538bfa704762a60fbe8ace734ef7787d59bfc9c6edf9200a9"} Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.978055 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee8e767-6556-4a42-9ab2-e68a10380019-certs" (OuterVolumeSpecName: "certs") pod "9ee8e767-6556-4a42-9ab2-e68a10380019" (UID: "9ee8e767-6556-4a42-9ab2-e68a10380019"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.980248 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee8e767-6556-4a42-9ab2-e68a10380019-kube-api-access-fjt5d" (OuterVolumeSpecName: "kube-api-access-fjt5d") pod "9ee8e767-6556-4a42-9ab2-e68a10380019" (UID: "9ee8e767-6556-4a42-9ab2-e68a10380019"). InnerVolumeSpecName "kube-api-access-fjt5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.983168 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.983632 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9ee8e767-6556-4a42-9ab2-e68a10380019" (UID: "9ee8e767-6556-4a42-9ab2-e68a10380019"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.983708 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-scripts" (OuterVolumeSpecName: "scripts") pod "9ee8e767-6556-4a42-9ab2-e68a10380019" (UID: "9ee8e767-6556-4a42-9ab2-e68a10380019"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.984867 4937 generic.go:334] "Generic (PLEG): container finished" podID="9ee8e767-6556-4a42-9ab2-e68a10380019" containerID="b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b" exitCode=0 Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.984914 4937 generic.go:334] "Generic (PLEG): container finished" podID="9ee8e767-6556-4a42-9ab2-e68a10380019" containerID="80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732" exitCode=143 Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.985028 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.985184 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"9ee8e767-6556-4a42-9ab2-e68a10380019","Type":"ContainerDied","Data":"b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b"} Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.985242 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"9ee8e767-6556-4a42-9ab2-e68a10380019","Type":"ContainerDied","Data":"80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732"} Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.985259 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"9ee8e767-6556-4a42-9ab2-e68a10380019","Type":"ContainerDied","Data":"baf47360558e8938886217738f2d987aae09716814ece2ba3921802b88a445ba"} Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.985279 4937 scope.go:117] "RemoveContainer" containerID="b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b" Feb 25 16:13:22 crc kubenswrapper[4937]: I0225 16:13:22.992812 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f6b49871-d3a3-4846-9c5d-3df7b920a420","Type":"ContainerStarted","Data":"478fccb5c6f63750622015e4e08938cc7d996259ea73c18157a97016273f96fa"} Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.017756 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=3.868500133 podStartE2EDuration="7.0177392s" podCreationTimestamp="2026-02-25 16:13:16 +0000 UTC" firstStartedPulling="2026-02-25 16:13:18.832572984 +0000 UTC m=+1649.845964874" lastFinishedPulling="2026-02-25 16:13:21.981812051 +0000 UTC m=+1652.995203941" observedRunningTime="2026-02-25 16:13:23.013183996 +0000 UTC m=+1654.026575886" watchObservedRunningTime="2026-02-25 16:13:23.0177392 +0000 UTC m=+1654.031131090" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.036691 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.051640 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-config-data" (OuterVolumeSpecName: "config-data") pod "9ee8e767-6556-4a42-9ab2-e68a10380019" (UID: "9ee8e767-6556-4a42-9ab2-e68a10380019"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.054406 4937 scope.go:117] "RemoveContainer" containerID="80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.062760 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a03635e4-24a3-460b-ab0e-e3f677ac95c5-openstack-config\") pod \"openstackclient\" (UID: \"a03635e4-24a3-460b-ab0e-e3f677ac95c5\") " pod="openstack/openstackclient" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.062839 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03635e4-24a3-460b-ab0e-e3f677ac95c5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a03635e4-24a3-460b-ab0e-e3f677ac95c5\") " pod="openstack/openstackclient" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.062981 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmfwh\" (UniqueName: \"kubernetes.io/projected/a03635e4-24a3-460b-ab0e-e3f677ac95c5-kube-api-access-hmfwh\") pod \"openstackclient\" (UID: \"a03635e4-24a3-460b-ab0e-e3f677ac95c5\") " pod="openstack/openstackclient" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.063035 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a03635e4-24a3-460b-ab0e-e3f677ac95c5-openstack-config-secret\") pod \"openstackclient\" (UID: \"a03635e4-24a3-460b-ab0e-e3f677ac95c5\") " pod="openstack/openstackclient" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.063195 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.063214 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ee8e767-6556-4a42-9ab2-e68a10380019-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.063223 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjt5d\" (UniqueName: \"kubernetes.io/projected/9ee8e767-6556-4a42-9ab2-e68a10380019-kube-api-access-fjt5d\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.063232 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.063240 4937 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9ee8e767-6556-4a42-9ab2-e68a10380019-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.063252 4937 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.067771 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ee8e767-6556-4a42-9ab2-e68a10380019" (UID: "9ee8e767-6556-4a42-9ab2-e68a10380019"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.079811 4937 scope.go:117] "RemoveContainer" containerID="b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b" Feb 25 16:13:23 crc kubenswrapper[4937]: E0225 16:13:23.091720 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b\": container with ID starting with b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b not found: ID does not exist" containerID="b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.091777 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b"} err="failed to get container status \"b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b\": rpc error: code = NotFound desc = could not find container \"b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b\": container with ID starting with b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b not found: ID does not exist" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.091803 4937 scope.go:117] "RemoveContainer" containerID="80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732" Feb 25 16:13:23 crc kubenswrapper[4937]: E0225 16:13:23.095685 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732\": container with ID starting with 80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732 not found: ID does not exist" containerID="80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.095729 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732"} err="failed to get container status \"80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732\": rpc error: code = NotFound desc = could not find container \"80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732\": container with ID starting with 80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732 not found: ID does not exist" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.095761 4937 scope.go:117] "RemoveContainer" containerID="b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.096244 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b"} err="failed to get container status \"b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b\": rpc error: code = NotFound desc = could not find container \"b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b\": container with ID starting with b6caabf8a89755f6acbac5233b829770a1dcb5ae7940dda97dbcf996e470b01b not found: ID does not exist" 
Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.096264 4937 scope.go:117] "RemoveContainer" containerID="80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.096575 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732"} err="failed to get container status \"80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732\": rpc error: code = NotFound desc = could not find container \"80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732\": container with ID starting with 80d4c690a34b10d0b8a1138fb0329501dfb86fb686d5edc80a4db55022973732 not found: ID does not exist" Feb 25 16:13:23 crc kubenswrapper[4937]: E0225 16:13:23.103001 4937 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 25 16:13:23 crc kubenswrapper[4937]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_3105ae01-7d47-4714-90ae-d17157c01e91_0(b595c0ef14fcab5028e197f31880762305c47d6c89ac924a6180ed2a31783508): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b595c0ef14fcab5028e197f31880762305c47d6c89ac924a6180ed2a31783508" Netns:"/var/run/netns/4dc8695a-95f3-4139-b612-70bf91e5a755" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=b595c0ef14fcab5028e197f31880762305c47d6c89ac924a6180ed2a31783508;K8S_POD_UID=3105ae01-7d47-4714-90ae-d17157c01e91" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/3105ae01-7d47-4714-90ae-d17157c01e91]: expected pod UID "3105ae01-7d47-4714-90ae-d17157c01e91" but got "a03635e4-24a3-460b-ab0e-e3f677ac95c5" from Kube API Feb 25 16:13:23 crc kubenswrapper[4937]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 25 16:13:23 crc kubenswrapper[4937]: > Feb 25 16:13:23 crc kubenswrapper[4937]: E0225 16:13:23.103062 4937 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 25 16:13:23 crc kubenswrapper[4937]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_3105ae01-7d47-4714-90ae-d17157c01e91_0(b595c0ef14fcab5028e197f31880762305c47d6c89ac924a6180ed2a31783508): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b595c0ef14fcab5028e197f31880762305c47d6c89ac924a6180ed2a31783508" Netns:"/var/run/netns/4dc8695a-95f3-4139-b612-70bf91e5a755" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=b595c0ef14fcab5028e197f31880762305c47d6c89ac924a6180ed2a31783508;K8S_POD_UID=3105ae01-7d47-4714-90ae-d17157c01e91" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: 
[openstack/openstackclient/3105ae01-7d47-4714-90ae-d17157c01e91]: expected pod UID "3105ae01-7d47-4714-90ae-d17157c01e91" but got "a03635e4-24a3-460b-ab0e-e3f677ac95c5" from Kube API Feb 25 16:13:23 crc kubenswrapper[4937]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 25 16:13:23 crc kubenswrapper[4937]: > pod="openstack/openstackclient" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.165467 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmfwh\" (UniqueName: \"kubernetes.io/projected/a03635e4-24a3-460b-ab0e-e3f677ac95c5-kube-api-access-hmfwh\") pod \"openstackclient\" (UID: \"a03635e4-24a3-460b-ab0e-e3f677ac95c5\") " pod="openstack/openstackclient" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.165534 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a03635e4-24a3-460b-ab0e-e3f677ac95c5-openstack-config-secret\") pod \"openstackclient\" (UID: \"a03635e4-24a3-460b-ab0e-e3f677ac95c5\") " pod="openstack/openstackclient" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.165601 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a03635e4-24a3-460b-ab0e-e3f677ac95c5-openstack-config\") pod \"openstackclient\" (UID: \"a03635e4-24a3-460b-ab0e-e3f677ac95c5\") " pod="openstack/openstackclient" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.165643 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03635e4-24a3-460b-ab0e-e3f677ac95c5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a03635e4-24a3-460b-ab0e-e3f677ac95c5\") " pod="openstack/openstackclient" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.165725 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee8e767-6556-4a42-9ab2-e68a10380019-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.166766 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a03635e4-24a3-460b-ab0e-e3f677ac95c5-openstack-config\") pod \"openstackclient\" (UID: \"a03635e4-24a3-460b-ab0e-e3f677ac95c5\") " pod="openstack/openstackclient" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.171954 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03635e4-24a3-460b-ab0e-e3f677ac95c5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a03635e4-24a3-460b-ab0e-e3f677ac95c5\") " pod="openstack/openstackclient" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.176941 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a03635e4-24a3-460b-ab0e-e3f677ac95c5-openstack-config-secret\") pod \"openstackclient\" (UID: \"a03635e4-24a3-460b-ab0e-e3f677ac95c5\") " pod="openstack/openstackclient" Feb 25 16:13:23 crc 
kubenswrapper[4937]: I0225 16:13:23.179788 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmfwh\" (UniqueName: \"kubernetes.io/projected/a03635e4-24a3-460b-ab0e-e3f677ac95c5-kube-api-access-hmfwh\") pod \"openstackclient\" (UID: \"a03635e4-24a3-460b-ab0e-e3f677ac95c5\") " pod="openstack/openstackclient" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.305714 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.343069 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.365691 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.429749 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ee8e767-6556-4a42-9ab2-e68a10380019" path="/var/lib/kubelet/pods/9ee8e767-6556-4a42-9ab2-e68a10380019/volumes" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.430826 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.433974 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.436828 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.437547 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.438132 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.452045 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.471897 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49t5d\" (UniqueName: \"kubernetes.io/projected/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-kube-api-access-49t5d\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.472011 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.472083 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.472100 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-scripts\") pod \"cloudkitty-api-0\" (UID: 
\"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.472119 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-logs\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.472138 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.472260 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-config-data\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.472352 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.472474 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-certs\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.603446 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49t5d\" (UniqueName: \"kubernetes.io/projected/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-kube-api-access-49t5d\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.603850 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.603984 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.604007 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-scripts\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.604038 4937 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-logs\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.604070 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.604101 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-config-data\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.604166 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.604234 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-certs\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.608193 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-logs\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.615892 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-scripts\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.616555 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-certs\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.618350 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.618608 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-config-data\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.618618 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.623096 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.624726 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.649099 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49t5d\" (UniqueName: \"kubernetes.io/projected/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-kube-api-access-49t5d\") pod \"cloudkitty-api-0\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.783114 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.952952 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.959044 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69bf8b4fb-nfx4x" podUID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.185:9311/healthcheck\": read tcp 10.217.0.2:56700->10.217.0.185:9311: read: connection reset by peer" Feb 25 16:13:23 crc kubenswrapper[4937]: I0225 16:13:23.959827 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69bf8b4fb-nfx4x" podUID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.185:9311/healthcheck\": read tcp 10.217.0.2:56690->10.217.0.185:9311: read: connection reset by peer" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.038696 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a03635e4-24a3-460b-ab0e-e3f677ac95c5","Type":"ContainerStarted","Data":"701661d48cc3c994ede189033fc02130e182827ad8d077b785f5e88c53daaf99"} Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.047945 4937 generic.go:334] "Generic (PLEG): container finished" podID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerID="a453a521d7da1667622e4fa12753c941359a7de17fd65b93adc1a0a83ecd77e1" exitCode=0 Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.048030 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69bf8b4fb-nfx4x" event={"ID":"8ba5074b-5bcc-4028-9fe9-7ea203f32b25","Type":"ContainerDied","Data":"a453a521d7da1667622e4fa12753c941359a7de17fd65b93adc1a0a83ecd77e1"} Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.056338 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"f6b49871-d3a3-4846-9c5d-3df7b920a420","Type":"ContainerStarted","Data":"c0f66cd945489c9810d3640717ddbebad9730e4a22eb4072d135ad2d19ddf7cc"} Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.056477 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.071928 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.075415 4937 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3105ae01-7d47-4714-90ae-d17157c01e91" podUID="a03635e4-24a3-460b-ab0e-e3f677ac95c5" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.125852 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3105ae01-7d47-4714-90ae-d17157c01e91-openstack-config\") pod \"3105ae01-7d47-4714-90ae-d17157c01e91\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.125955 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5tfl\" (UniqueName: \"kubernetes.io/projected/3105ae01-7d47-4714-90ae-d17157c01e91-kube-api-access-d5tfl\") pod \"3105ae01-7d47-4714-90ae-d17157c01e91\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.126019 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3105ae01-7d47-4714-90ae-d17157c01e91-combined-ca-bundle\") pod \"3105ae01-7d47-4714-90ae-d17157c01e91\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.126042 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3105ae01-7d47-4714-90ae-d17157c01e91-openstack-config-secret\") pod \"3105ae01-7d47-4714-90ae-d17157c01e91\" (UID: \"3105ae01-7d47-4714-90ae-d17157c01e91\") " Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.126990 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3105ae01-7d47-4714-90ae-d17157c01e91-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "3105ae01-7d47-4714-90ae-d17157c01e91" (UID: "3105ae01-7d47-4714-90ae-d17157c01e91"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.132788 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3105ae01-7d47-4714-90ae-d17157c01e91-kube-api-access-d5tfl" (OuterVolumeSpecName: "kube-api-access-d5tfl") pod "3105ae01-7d47-4714-90ae-d17157c01e91" (UID: "3105ae01-7d47-4714-90ae-d17157c01e91"). InnerVolumeSpecName "kube-api-access-d5tfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.136950 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3105ae01-7d47-4714-90ae-d17157c01e91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3105ae01-7d47-4714-90ae-d17157c01e91" (UID: "3105ae01-7d47-4714-90ae-d17157c01e91"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.140451 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3105ae01-7d47-4714-90ae-d17157c01e91-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "3105ae01-7d47-4714-90ae-d17157c01e91" (UID: "3105ae01-7d47-4714-90ae-d17157c01e91"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.228553 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5tfl\" (UniqueName: \"kubernetes.io/projected/3105ae01-7d47-4714-90ae-d17157c01e91-kube-api-access-d5tfl\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.228600 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3105ae01-7d47-4714-90ae-d17157c01e91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.228613 4937 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3105ae01-7d47-4714-90ae-d17157c01e91-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.228625 4937 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3105ae01-7d47-4714-90ae-d17157c01e91-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.370397 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.614478 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.738921 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwlgb\" (UniqueName: \"kubernetes.io/projected/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-kube-api-access-mwlgb\") pod \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.739252 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-config-data\") pod \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.739322 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-logs\") pod \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.739451 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-config-data-custom\") pod \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.739536 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-combined-ca-bundle\") pod \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\" (UID: \"8ba5074b-5bcc-4028-9fe9-7ea203f32b25\") " Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.740322 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-logs" (OuterVolumeSpecName: "logs") pod "8ba5074b-5bcc-4028-9fe9-7ea203f32b25" (UID: "8ba5074b-5bcc-4028-9fe9-7ea203f32b25"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.754048 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8ba5074b-5bcc-4028-9fe9-7ea203f32b25" (UID: "8ba5074b-5bcc-4028-9fe9-7ea203f32b25"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.776447 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-kube-api-access-mwlgb" (OuterVolumeSpecName: "kube-api-access-mwlgb") pod "8ba5074b-5bcc-4028-9fe9-7ea203f32b25" (UID: "8ba5074b-5bcc-4028-9fe9-7ea203f32b25"). InnerVolumeSpecName "kube-api-access-mwlgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.780893 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ba5074b-5bcc-4028-9fe9-7ea203f32b25" (UID: "8ba5074b-5bcc-4028-9fe9-7ea203f32b25"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.800861 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.808456 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-config-data" (OuterVolumeSpecName: "config-data") pod "8ba5074b-5bcc-4028-9fe9-7ea203f32b25" (UID: "8ba5074b-5bcc-4028-9fe9-7ea203f32b25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.841633 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.841659 4937 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.841669 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.841678 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwlgb\" (UniqueName: \"kubernetes.io/projected/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-kube-api-access-mwlgb\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.841686 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba5074b-5bcc-4028-9fe9-7ea203f32b25-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.884402 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-575b75bdd-mz6p6" Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.965963 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5bc476bcbd-vwgwx"] Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.967557 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5bc476bcbd-vwgwx" podUID="443f8b26-eac8-403e-b311-07cf1cd7cb83" containerName="placement-log" containerID="cri-o://03ecb7f01fc0cd1c9e12aaa6747d29e34de1ecbc9b48b2ce60eec63988b7f37e" gracePeriod=30 Feb 25 16:13:24 crc kubenswrapper[4937]: I0225 16:13:24.968592 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5bc476bcbd-vwgwx" podUID="443f8b26-eac8-403e-b311-07cf1cd7cb83" containerName="placement-api" containerID="cri-o://78644883a09ab939d41e6f50c4dfff46be349cce6856355a8ab86ae21f7e9023" gracePeriod=30 Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.138071 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69bf8b4fb-nfx4x" event={"ID":"8ba5074b-5bcc-4028-9fe9-7ea203f32b25","Type":"ContainerDied","Data":"f6b2607ac85e9b653ed3d4f242c71b568d525aa1ca72ebcfade6b7ac8809b92e"} Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.138127 4937 scope.go:117] "RemoveContainer" containerID="a453a521d7da1667622e4fa12753c941359a7de17fd65b93adc1a0a83ecd77e1" Feb 25 
16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.138287 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69bf8b4fb-nfx4x" Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.153789 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2577b339-c9c0-4e63-afa1-c0b2fb7177b4","Type":"ContainerStarted","Data":"40453a9b19a803b2d1da79a4068132eda37db5e9c992255359e0e584ea855f8c"} Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.153843 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2577b339-c9c0-4e63-afa1-c0b2fb7177b4","Type":"ContainerStarted","Data":"20c0dee27177aadaf0e1776572c1df5c65beebb2c40617a80b58ad229cd71592"} Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.167298 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="af6a9d4b-1995-4a50-bc72-83bbebe6c32b" containerName="cloudkitty-proc" containerID="cri-o://7d23e1fca2bc309538bfa704762a60fbe8ace734ef7787d59bfc9c6edf9200a9" gracePeriod=30 Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.167710 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f6b49871-d3a3-4846-9c5d-3df7b920a420","Type":"ContainerStarted","Data":"31bc16ef8c1b14b656d54dd647e33215a7d9cb32c0e92e7c7c74bc0384b648bd"} Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.167742 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.186590 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69bf8b4fb-nfx4x"] Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.196291 4937 scope.go:117] "RemoveContainer" containerID="17dcd3c571d10ecb5fbeba2d96fa4f01b7d503a85b38e859b46e71d755d087e5" Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.209079 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-69bf8b4fb-nfx4x"] Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.214737 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.214717083 podStartE2EDuration="5.214717083s" podCreationTimestamp="2026-02-25 16:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:25.209955184 +0000 UTC m=+1656.223347074" watchObservedRunningTime="2026-02-25 16:13:25.214717083 +0000 UTC m=+1656.228108963" Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.215771 4937 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3105ae01-7d47-4714-90ae-d17157c01e91" podUID="a03635e4-24a3-460b-ab0e-e3f677ac95c5" Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.384215 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3105ae01-7d47-4714-90ae-d17157c01e91" path="/var/lib/kubelet/pods/3105ae01-7d47-4714-90ae-d17157c01e91/volumes" Feb 25 16:13:25 crc kubenswrapper[4937]: I0225 16:13:25.384888 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" path="/var/lib/kubelet/pods/8ba5074b-5bcc-4028-9fe9-7ea203f32b25/volumes" Feb 25 16:13:26 crc kubenswrapper[4937]: I0225 16:13:26.219338 4937 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2577b339-c9c0-4e63-afa1-c0b2fb7177b4","Type":"ContainerStarted","Data":"73834f0d72d100f7acfc9c3e1aaa128281b7a8d3463254ff25e22c106583027d"} Feb 25 16:13:26 crc kubenswrapper[4937]: I0225 16:13:26.220166 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 25 16:13:26 crc kubenswrapper[4937]: I0225 16:13:26.245287 4937 generic.go:334] "Generic (PLEG): container finished" podID="443f8b26-eac8-403e-b311-07cf1cd7cb83" containerID="03ecb7f01fc0cd1c9e12aaa6747d29e34de1ecbc9b48b2ce60eec63988b7f37e" exitCode=143 Feb 25 16:13:26 crc kubenswrapper[4937]: I0225 16:13:26.245387 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bc476bcbd-vwgwx" event={"ID":"443f8b26-eac8-403e-b311-07cf1cd7cb83","Type":"ContainerDied","Data":"03ecb7f01fc0cd1c9e12aaa6747d29e34de1ecbc9b48b2ce60eec63988b7f37e"} Feb 25 16:13:26 crc kubenswrapper[4937]: I0225 16:13:26.260050 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.260030348 podStartE2EDuration="3.260030348s" podCreationTimestamp="2026-02-25 16:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:26.240669893 +0000 UTC m=+1657.254061793" watchObservedRunningTime="2026-02-25 16:13:26.260030348 +0000 UTC m=+1657.273422238" Feb 25 16:13:26 crc kubenswrapper[4937]: I0225 16:13:26.333095 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 25 16:13:27 crc kubenswrapper[4937]: I0225 16:13:27.029212 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-754k4" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="registry-server" probeResult="failure" output=< Feb 25 16:13:27 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 16:13:27 crc kubenswrapper[4937]: > Feb 25 16:13:27 crc kubenswrapper[4937]: I0225 16:13:27.503832 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:13:27 crc kubenswrapper[4937]: I0225 16:13:27.618697 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-8ph6z"] Feb 25 16:13:27 crc kubenswrapper[4937]: I0225 16:13:27.618925 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" podUID="2d9a76a7-a730-4436-956e-d43596599433" containerName="dnsmasq-dns" containerID="cri-o://dac55af911733a13003687c9df498f486d492219add91cedc53aaadb23c0ef74" gracePeriod=10 Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.274471 4937 generic.go:334] "Generic (PLEG): container finished" podID="2d9a76a7-a730-4436-956e-d43596599433" containerID="dac55af911733a13003687c9df498f486d492219add91cedc53aaadb23c0ef74" exitCode=0 Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.274875 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" event={"ID":"2d9a76a7-a730-4436-956e-d43596599433","Type":"ContainerDied","Data":"dac55af911733a13003687c9df498f486d492219add91cedc53aaadb23c0ef74"} Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.393533 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.467115 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjf6s\" (UniqueName: \"kubernetes.io/projected/2d9a76a7-a730-4436-956e-d43596599433-kube-api-access-hjf6s\") pod \"2d9a76a7-a730-4436-956e-d43596599433\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.467211 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-ovsdbserver-nb\") pod \"2d9a76a7-a730-4436-956e-d43596599433\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.467404 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-dns-svc\") pod \"2d9a76a7-a730-4436-956e-d43596599433\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.467425 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-config\") pod \"2d9a76a7-a730-4436-956e-d43596599433\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.467461 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-ovsdbserver-sb\") pod \"2d9a76a7-a730-4436-956e-d43596599433\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.467511 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-dns-swift-storage-0\") pod \"2d9a76a7-a730-4436-956e-d43596599433\" (UID: \"2d9a76a7-a730-4436-956e-d43596599433\") " Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.496724 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d9a76a7-a730-4436-956e-d43596599433-kube-api-access-hjf6s" (OuterVolumeSpecName: "kube-api-access-hjf6s") pod "2d9a76a7-a730-4436-956e-d43596599433" (UID: "2d9a76a7-a730-4436-956e-d43596599433"). InnerVolumeSpecName "kube-api-access-hjf6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.554189 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2d9a76a7-a730-4436-956e-d43596599433" (UID: "2d9a76a7-a730-4436-956e-d43596599433"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.562085 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-config" (OuterVolumeSpecName: "config") pod "2d9a76a7-a730-4436-956e-d43596599433" (UID: "2d9a76a7-a730-4436-956e-d43596599433"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.564786 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2d9a76a7-a730-4436-956e-d43596599433" (UID: "2d9a76a7-a730-4436-956e-d43596599433"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.569270 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.569301 4937 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.569310 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjf6s\" (UniqueName: \"kubernetes.io/projected/2d9a76a7-a730-4436-956e-d43596599433-kube-api-access-hjf6s\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.569320 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.591080 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2d9a76a7-a730-4436-956e-d43596599433" (UID: "2d9a76a7-a730-4436-956e-d43596599433"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.647127 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2d9a76a7-a730-4436-956e-d43596599433" (UID: "2d9a76a7-a730-4436-956e-d43596599433"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.670532 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.670829 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d9a76a7-a730-4436-956e-d43596599433-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.888044 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.977146 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-scripts\") pod \"443f8b26-eac8-403e-b311-07cf1cd7cb83\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.977248 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-config-data\") pod \"443f8b26-eac8-403e-b311-07cf1cd7cb83\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.977274 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27wcd\" (UniqueName: \"kubernetes.io/projected/443f8b26-eac8-403e-b311-07cf1cd7cb83-kube-api-access-27wcd\") pod \"443f8b26-eac8-403e-b311-07cf1cd7cb83\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.977375 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-combined-ca-bundle\") pod \"443f8b26-eac8-403e-b311-07cf1cd7cb83\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.977418 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-internal-tls-certs\") pod \"443f8b26-eac8-403e-b311-07cf1cd7cb83\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.977536 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/443f8b26-eac8-403e-b311-07cf1cd7cb83-logs\") pod \"443f8b26-eac8-403e-b311-07cf1cd7cb83\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.977572 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-public-tls-certs\") pod \"443f8b26-eac8-403e-b311-07cf1cd7cb83\" (UID: \"443f8b26-eac8-403e-b311-07cf1cd7cb83\") " Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.989912 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/443f8b26-eac8-403e-b311-07cf1cd7cb83-logs" (OuterVolumeSpecName: "logs") pod "443f8b26-eac8-403e-b311-07cf1cd7cb83" (UID: "443f8b26-eac8-403e-b311-07cf1cd7cb83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:28 crc kubenswrapper[4937]: I0225 16:13:28.991433 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/443f8b26-eac8-403e-b311-07cf1cd7cb83-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.016460 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/443f8b26-eac8-403e-b311-07cf1cd7cb83-kube-api-access-27wcd" (OuterVolumeSpecName: "kube-api-access-27wcd") pod "443f8b26-eac8-403e-b311-07cf1cd7cb83" (UID: "443f8b26-eac8-403e-b311-07cf1cd7cb83"). 
InnerVolumeSpecName "kube-api-access-27wcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.017721 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-scripts" (OuterVolumeSpecName: "scripts") pod "443f8b26-eac8-403e-b311-07cf1cd7cb83" (UID: "443f8b26-eac8-403e-b311-07cf1cd7cb83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.104121 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.104420 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27wcd\" (UniqueName: \"kubernetes.io/projected/443f8b26-eac8-403e-b311-07cf1cd7cb83-kube-api-access-27wcd\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.129982 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "443f8b26-eac8-403e-b311-07cf1cd7cb83" (UID: "443f8b26-eac8-403e-b311-07cf1cd7cb83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.206825 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.227674 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-config-data" (OuterVolumeSpecName: "config-data") pod "443f8b26-eac8-403e-b311-07cf1cd7cb83" (UID: "443f8b26-eac8-403e-b311-07cf1cd7cb83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.230083 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "443f8b26-eac8-403e-b311-07cf1cd7cb83" (UID: "443f8b26-eac8-403e-b311-07cf1cd7cb83"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.236218 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "443f8b26-eac8-403e-b311-07cf1cd7cb83" (UID: "443f8b26-eac8-403e-b311-07cf1cd7cb83"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.308880 4937 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.308909 4937 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.308920 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/443f8b26-eac8-403e-b311-07cf1cd7cb83-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.314316 4937 generic.go:334] "Generic (PLEG): container finished" podID="443f8b26-eac8-403e-b311-07cf1cd7cb83" containerID="78644883a09ab939d41e6f50c4dfff46be349cce6856355a8ab86ae21f7e9023" exitCode=0 Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.314404 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bc476bcbd-vwgwx" event={"ID":"443f8b26-eac8-403e-b311-07cf1cd7cb83","Type":"ContainerDied","Data":"78644883a09ab939d41e6f50c4dfff46be349cce6856355a8ab86ae21f7e9023"} Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.314445 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bc476bcbd-vwgwx" event={"ID":"443f8b26-eac8-403e-b311-07cf1cd7cb83","Type":"ContainerDied","Data":"c3b2bce77205dd9e6fdf6556261c229c82b64a5eb3ef54e20adec1688fd896cc"} Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.314462 4937 scope.go:117] "RemoveContainer" containerID="78644883a09ab939d41e6f50c4dfff46be349cce6856355a8ab86ae21f7e9023" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.314615 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bc476bcbd-vwgwx" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.339449 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" event={"ID":"2d9a76a7-a730-4436-956e-d43596599433","Type":"ContainerDied","Data":"f2f871d73458e61edf424529f66db7d7660497899f89390b2407be4fd8bf1029"} Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.339604 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-8ph6z" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.346853 4937 generic.go:334] "Generic (PLEG): container finished" podID="af6a9d4b-1995-4a50-bc72-83bbebe6c32b" containerID="7d23e1fca2bc309538bfa704762a60fbe8ace734ef7787d59bfc9c6edf9200a9" exitCode=0 Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.347066 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"af6a9d4b-1995-4a50-bc72-83bbebe6c32b","Type":"ContainerDied","Data":"7d23e1fca2bc309538bfa704762a60fbe8ace734ef7787d59bfc9c6edf9200a9"} Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.362816 4937 scope.go:117] "RemoveContainer" containerID="03ecb7f01fc0cd1c9e12aaa6747d29e34de1ecbc9b48b2ce60eec63988b7f37e" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.395358 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5bc476bcbd-vwgwx"] Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.395391 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5bc476bcbd-vwgwx"] Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.403534 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-8ph6z"] Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.411025 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-8ph6z"] Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.411958 4937 scope.go:117] "RemoveContainer" containerID="78644883a09ab939d41e6f50c4dfff46be349cce6856355a8ab86ae21f7e9023" Feb 25 16:13:29 crc kubenswrapper[4937]: E0225 16:13:29.412375 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78644883a09ab939d41e6f50c4dfff46be349cce6856355a8ab86ae21f7e9023\": container with ID starting with 78644883a09ab939d41e6f50c4dfff46be349cce6856355a8ab86ae21f7e9023 not found: ID does not exist" containerID="78644883a09ab939d41e6f50c4dfff46be349cce6856355a8ab86ae21f7e9023" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.412537 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78644883a09ab939d41e6f50c4dfff46be349cce6856355a8ab86ae21f7e9023"} err="failed to get container status \"78644883a09ab939d41e6f50c4dfff46be349cce6856355a8ab86ae21f7e9023\": rpc error: code = NotFound desc = could not find container \"78644883a09ab939d41e6f50c4dfff46be349cce6856355a8ab86ae21f7e9023\": container with ID starting with 78644883a09ab939d41e6f50c4dfff46be349cce6856355a8ab86ae21f7e9023 not found: ID does not exist" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.412630 4937 scope.go:117] "RemoveContainer" containerID="03ecb7f01fc0cd1c9e12aaa6747d29e34de1ecbc9b48b2ce60eec63988b7f37e" Feb 25 16:13:29 crc kubenswrapper[4937]: E0225 16:13:29.412943 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03ecb7f01fc0cd1c9e12aaa6747d29e34de1ecbc9b48b2ce60eec63988b7f37e\": container with ID starting with 03ecb7f01fc0cd1c9e12aaa6747d29e34de1ecbc9b48b2ce60eec63988b7f37e not found: ID does not exist" containerID="03ecb7f01fc0cd1c9e12aaa6747d29e34de1ecbc9b48b2ce60eec63988b7f37e" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.413024 4937 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"03ecb7f01fc0cd1c9e12aaa6747d29e34de1ecbc9b48b2ce60eec63988b7f37e"} err="failed to get container status \"03ecb7f01fc0cd1c9e12aaa6747d29e34de1ecbc9b48b2ce60eec63988b7f37e\": rpc error: code = NotFound desc = could not find container \"03ecb7f01fc0cd1c9e12aaa6747d29e34de1ecbc9b48b2ce60eec63988b7f37e\": container with ID starting with 03ecb7f01fc0cd1c9e12aaa6747d29e34de1ecbc9b48b2ce60eec63988b7f37e not found: ID does not exist" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.413115 4937 scope.go:117] "RemoveContainer" containerID="dac55af911733a13003687c9df498f486d492219add91cedc53aaadb23c0ef74" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.445610 4937 scope.go:117] "RemoveContainer" containerID="784fd775475f4891f5b4fd847b9f1778442d353bae9b82a46c3287cd29e4b0f7" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.597131 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.715616 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-config-data-custom\") pod \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.716032 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-config-data\") pod \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.716083 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-scripts\") pod \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.716100 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-combined-ca-bundle\") pod \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.716125 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvw8k\" (UniqueName: \"kubernetes.io/projected/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-kube-api-access-pvw8k\") pod \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.716197 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-certs\") pod \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\" (UID: \"af6a9d4b-1995-4a50-bc72-83bbebe6c32b\") " Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.719505 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "af6a9d4b-1995-4a50-bc72-83bbebe6c32b" (UID: "af6a9d4b-1995-4a50-bc72-83bbebe6c32b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.720476 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-certs" (OuterVolumeSpecName: "certs") pod "af6a9d4b-1995-4a50-bc72-83bbebe6c32b" (UID: "af6a9d4b-1995-4a50-bc72-83bbebe6c32b"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.722573 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-scripts" (OuterVolumeSpecName: "scripts") pod "af6a9d4b-1995-4a50-bc72-83bbebe6c32b" (UID: "af6a9d4b-1995-4a50-bc72-83bbebe6c32b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.727597 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-kube-api-access-pvw8k" (OuterVolumeSpecName: "kube-api-access-pvw8k") pod "af6a9d4b-1995-4a50-bc72-83bbebe6c32b" (UID: "af6a9d4b-1995-4a50-bc72-83bbebe6c32b"). InnerVolumeSpecName "kube-api-access-pvw8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.750696 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af6a9d4b-1995-4a50-bc72-83bbebe6c32b" (UID: "af6a9d4b-1995-4a50-bc72-83bbebe6c32b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.767639 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-config-data" (OuterVolumeSpecName: "config-data") pod "af6a9d4b-1995-4a50-bc72-83bbebe6c32b" (UID: "af6a9d4b-1995-4a50-bc72-83bbebe6c32b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.818279 4937 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.818313 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.818322 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.818330 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.818339 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvw8k\" (UniqueName: \"kubernetes.io/projected/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-kube-api-access-pvw8k\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:29 crc kubenswrapper[4937]: I0225 16:13:29.818349 4937 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/af6a9d4b-1995-4a50-bc72-83bbebe6c32b-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.131638 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-75675bb4d7-q28jd"] Feb 25 16:13:30 crc kubenswrapper[4937]: E0225 16:13:30.132028 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerName="barbican-api" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.132045 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerName="barbican-api" Feb 25 16:13:30 crc kubenswrapper[4937]: E0225 16:13:30.132063 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9a76a7-a730-4436-956e-d43596599433" containerName="init" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.132070 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9a76a7-a730-4436-956e-d43596599433" containerName="init" Feb 25 16:13:30 crc kubenswrapper[4937]: E0225 16:13:30.132081 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="443f8b26-eac8-403e-b311-07cf1cd7cb83" containerName="placement-api" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.132087 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="443f8b26-eac8-403e-b311-07cf1cd7cb83" containerName="placement-api" Feb 25 16:13:30 crc kubenswrapper[4937]: E0225 16:13:30.132101 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9a76a7-a730-4436-956e-d43596599433" containerName="dnsmasq-dns" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.132107 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9a76a7-a730-4436-956e-d43596599433" containerName="dnsmasq-dns" Feb 25 16:13:30 crc kubenswrapper[4937]: E0225 16:13:30.132121 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6a9d4b-1995-4a50-bc72-83bbebe6c32b" containerName="cloudkitty-proc" 
Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.132128 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6a9d4b-1995-4a50-bc72-83bbebe6c32b" containerName="cloudkitty-proc" Feb 25 16:13:30 crc kubenswrapper[4937]: E0225 16:13:30.132139 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="443f8b26-eac8-403e-b311-07cf1cd7cb83" containerName="placement-log" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.132145 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="443f8b26-eac8-403e-b311-07cf1cd7cb83" containerName="placement-log" Feb 25 16:13:30 crc kubenswrapper[4937]: E0225 16:13:30.132164 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerName="barbican-api-log" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.132170 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerName="barbican-api-log" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.132338 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerName="barbican-api-log" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.132358 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="443f8b26-eac8-403e-b311-07cf1cd7cb83" containerName="placement-log" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.132371 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d9a76a7-a730-4436-956e-d43596599433" containerName="dnsmasq-dns" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.132379 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="443f8b26-eac8-403e-b311-07cf1cd7cb83" containerName="placement-api" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.132392 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6a9d4b-1995-4a50-bc72-83bbebe6c32b" containerName="cloudkitty-proc" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.132402 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba5074b-5bcc-4028-9fe9-7ea203f32b25" containerName="barbican-api" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.133478 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.135541 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.135729 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.135930 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.149740 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75675bb4d7-q28jd"] Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.233586 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0193db4-c078-4d8c-8437-538da8d426d2-public-tls-certs\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.233668 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0193db4-c078-4d8c-8437-538da8d426d2-log-httpd\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.233718 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0193db4-c078-4d8c-8437-538da8d426d2-internal-tls-certs\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.233822 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzj4d\" (UniqueName: \"kubernetes.io/projected/c0193db4-c078-4d8c-8437-538da8d426d2-kube-api-access-gzj4d\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.233865 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0193db4-c078-4d8c-8437-538da8d426d2-run-httpd\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.233884 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c0193db4-c078-4d8c-8437-538da8d426d2-etc-swift\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.233914 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0193db4-c078-4d8c-8437-538da8d426d2-config-data\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " 
pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.233967 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0193db4-c078-4d8c-8437-538da8d426d2-combined-ca-bundle\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.335957 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0193db4-c078-4d8c-8437-538da8d426d2-log-httpd\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.336046 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0193db4-c078-4d8c-8437-538da8d426d2-internal-tls-certs\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.336121 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzj4d\" (UniqueName: \"kubernetes.io/projected/c0193db4-c078-4d8c-8437-538da8d426d2-kube-api-access-gzj4d\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.336168 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0193db4-c078-4d8c-8437-538da8d426d2-run-httpd\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.336188 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c0193db4-c078-4d8c-8437-538da8d426d2-etc-swift\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.336208 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0193db4-c078-4d8c-8437-538da8d426d2-config-data\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.336237 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0193db4-c078-4d8c-8437-538da8d426d2-combined-ca-bundle\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.336260 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0193db4-c078-4d8c-8437-538da8d426d2-public-tls-certs\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " 
pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.336453 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0193db4-c078-4d8c-8437-538da8d426d2-log-httpd\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.336703 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0193db4-c078-4d8c-8437-538da8d426d2-run-httpd\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.341188 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0193db4-c078-4d8c-8437-538da8d426d2-internal-tls-certs\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.341258 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0193db4-c078-4d8c-8437-538da8d426d2-config-data\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.354720 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0193db4-c078-4d8c-8437-538da8d426d2-combined-ca-bundle\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.357062 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzj4d\" (UniqueName: \"kubernetes.io/projected/c0193db4-c078-4d8c-8437-538da8d426d2-kube-api-access-gzj4d\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.358113 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0193db4-c078-4d8c-8437-538da8d426d2-public-tls-certs\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.358291 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c0193db4-c078-4d8c-8437-538da8d426d2-etc-swift\") pod \"swift-proxy-75675bb4d7-q28jd\" (UID: \"c0193db4-c078-4d8c-8437-538da8d426d2\") " pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.366105 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.366716 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"af6a9d4b-1995-4a50-bc72-83bbebe6c32b","Type":"ContainerDied","Data":"3ab1a2c8e1fc185a2ac153e6674ea551b0b3f9b194c210bc7cc7f4dcd1f990c3"} Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.367038 4937 scope.go:117] "RemoveContainer" containerID="7d23e1fca2bc309538bfa704762a60fbe8ace734ef7787d59bfc9c6edf9200a9" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.478538 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.489124 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.500051 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.519107 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.520276 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.523261 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.540131 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.641979 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4nqz\" (UniqueName: \"kubernetes.io/projected/1aaa1053-5b44-458d-aa42-a9804344d2e3-kube-api-access-z4nqz\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.642030 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.642047 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.642065 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1aaa1053-5b44-458d-aa42-a9804344d2e3-certs\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.642163 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-config-data\") pod \"cloudkitty-proc-0\" (UID: 
\"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.642244 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-scripts\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.743897 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-scripts\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.743940 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4nqz\" (UniqueName: \"kubernetes.io/projected/1aaa1053-5b44-458d-aa42-a9804344d2e3-kube-api-access-z4nqz\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.743963 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.743981 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.744000 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1aaa1053-5b44-458d-aa42-a9804344d2e3-certs\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.744083 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-config-data\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.749939 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-config-data\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.756688 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.762441 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.763012 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-scripts\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.772133 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4nqz\" (UniqueName: \"kubernetes.io/projected/1aaa1053-5b44-458d-aa42-a9804344d2e3-kube-api-access-z4nqz\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.772133 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1aaa1053-5b44-458d-aa42-a9804344d2e3-certs\") pod \"cloudkitty-proc-0\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.838045 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.838319 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="ceilometer-central-agent" containerID="cri-o://be08f64bb9075dd47090696dff9b7393cb09a71c381a4fb20ffd8f490a4757b4" gracePeriod=30 Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.838471 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="proxy-httpd" containerID="cri-o://fa1d5c93a79360e663af9ea5722c354b0efee7b8d01479c5d48d69f93bb4dc9b" gracePeriod=30 Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.838573 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="sg-core" containerID="cri-o://b755297cd1c0f0146497c9710ee4daaba508f5c278f65f587dba7b3a79f8473e" gracePeriod=30 Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.838612 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="ceilometer-notification-agent" containerID="cri-o://dc287f1e28ef0946a0a899f78118307dbe945bcabd976b94f3ab313d8b43fd20" gracePeriod=30 Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.843229 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 25 16:13:30 crc kubenswrapper[4937]: I0225 16:13:30.859177 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.101948 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75675bb4d7-q28jd"] Feb 25 16:13:31 crc kubenswrapper[4937]: W0225 16:13:31.115667 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0193db4_c078_4d8c_8437_538da8d426d2.slice/crio-f5d670402e4097e1515b53b912f38047255c30511364585c25e31ff3203964d8 WatchSource:0}: Error finding container f5d670402e4097e1515b53b912f38047255c30511364585c25e31ff3203964d8: Status 404 returned error can't find the container with id f5d670402e4097e1515b53b912f38047255c30511364585c25e31ff3203964d8 Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.419021 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d9a76a7-a730-4436-956e-d43596599433" path="/var/lib/kubelet/pods/2d9a76a7-a730-4436-956e-d43596599433/volumes" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.419832 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="443f8b26-eac8-403e-b311-07cf1cd7cb83" path="/var/lib/kubelet/pods/443f8b26-eac8-403e-b311-07cf1cd7cb83/volumes" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.421187 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af6a9d4b-1995-4a50-bc72-83bbebe6c32b" path="/var/lib/kubelet/pods/af6a9d4b-1995-4a50-bc72-83bbebe6c32b/volumes" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.429815 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.433880 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75675bb4d7-q28jd" event={"ID":"c0193db4-c078-4d8c-8437-538da8d426d2","Type":"ContainerStarted","Data":"ac2bea1b4eada741645707be57b2acc7cc440d102a6aad93fa28e78d1b1cd154"} Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.433930 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75675bb4d7-q28jd" event={"ID":"c0193db4-c078-4d8c-8437-538da8d426d2","Type":"ContainerStarted","Data":"f5d670402e4097e1515b53b912f38047255c30511364585c25e31ff3203964d8"} Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.465420 4937 generic.go:334] "Generic (PLEG): container finished" podID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerID="fa1d5c93a79360e663af9ea5722c354b0efee7b8d01479c5d48d69f93bb4dc9b" exitCode=0 Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.465450 4937 generic.go:334] "Generic (PLEG): container finished" podID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerID="b755297cd1c0f0146497c9710ee4daaba508f5c278f65f587dba7b3a79f8473e" exitCode=2 Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.465459 4937 generic.go:334] "Generic (PLEG): container finished" podID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerID="be08f64bb9075dd47090696dff9b7393cb09a71c381a4fb20ffd8f490a4757b4" exitCode=0 Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.465477 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0a2d638a-ffd8-4721-be54-eb6b911ffbf0","Type":"ContainerDied","Data":"fa1d5c93a79360e663af9ea5722c354b0efee7b8d01479c5d48d69f93bb4dc9b"} Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.465523 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a2d638a-ffd8-4721-be54-eb6b911ffbf0","Type":"ContainerDied","Data":"b755297cd1c0f0146497c9710ee4daaba508f5c278f65f587dba7b3a79f8473e"} Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.465534 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a2d638a-ffd8-4721-be54-eb6b911ffbf0","Type":"ContainerDied","Data":"be08f64bb9075dd47090696dff9b7393cb09a71c381a4fb20ffd8f490a4757b4"} Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.668856 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-ck6bq"] Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.676283 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ck6bq" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.691871 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ck6bq"] Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.780609 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wglbn\" (UniqueName: \"kubernetes.io/projected/dc7039f2-d86f-4915-92de-5eab8c16f281-kube-api-access-wglbn\") pod \"nova-api-db-create-ck6bq\" (UID: \"dc7039f2-d86f-4915-92de-5eab8c16f281\") " pod="openstack/nova-api-db-create-ck6bq" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.780685 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc7039f2-d86f-4915-92de-5eab8c16f281-operator-scripts\") pod \"nova-api-db-create-ck6bq\" (UID: \"dc7039f2-d86f-4915-92de-5eab8c16f281\") " pod="openstack/nova-api-db-create-ck6bq" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.856410 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-996c-account-create-update-dqbhk"] Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.859737 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-996c-account-create-update-dqbhk" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.863788 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.885332 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b4ba23-a361-40f5-9025-a80203afb802-operator-scripts\") pod \"nova-api-996c-account-create-update-dqbhk\" (UID: \"30b4ba23-a361-40f5-9025-a80203afb802\") " pod="openstack/nova-api-996c-account-create-update-dqbhk" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.885400 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kch68\" (UniqueName: \"kubernetes.io/projected/30b4ba23-a361-40f5-9025-a80203afb802-kube-api-access-kch68\") pod \"nova-api-996c-account-create-update-dqbhk\" (UID: \"30b4ba23-a361-40f5-9025-a80203afb802\") " pod="openstack/nova-api-996c-account-create-update-dqbhk" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.885477 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wglbn\" (UniqueName: \"kubernetes.io/projected/dc7039f2-d86f-4915-92de-5eab8c16f281-kube-api-access-wglbn\") pod \"nova-api-db-create-ck6bq\" (UID: \"dc7039f2-d86f-4915-92de-5eab8c16f281\") " pod="openstack/nova-api-db-create-ck6bq" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.885553 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc7039f2-d86f-4915-92de-5eab8c16f281-operator-scripts\") pod \"nova-api-db-create-ck6bq\" (UID: \"dc7039f2-d86f-4915-92de-5eab8c16f281\") " pod="openstack/nova-api-db-create-ck6bq" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.886226 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc7039f2-d86f-4915-92de-5eab8c16f281-operator-scripts\") pod \"nova-api-db-create-ck6bq\" (UID: \"dc7039f2-d86f-4915-92de-5eab8c16f281\") " pod="openstack/nova-api-db-create-ck6bq" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.900556 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-996c-account-create-update-dqbhk"] Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.966594 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-vhl9c"] Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.968144 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-vhl9c" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.986468 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vhl9c"] Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.991760 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b4ba23-a361-40f5-9025-a80203afb802-operator-scripts\") pod \"nova-api-996c-account-create-update-dqbhk\" (UID: \"30b4ba23-a361-40f5-9025-a80203afb802\") " pod="openstack/nova-api-996c-account-create-update-dqbhk" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.991823 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75cx9\" (UniqueName: \"kubernetes.io/projected/aca08b6d-8b50-4471-8ae5-0c3b517ef2b3-kube-api-access-75cx9\") pod \"nova-cell0-db-create-vhl9c\" (UID: \"aca08b6d-8b50-4471-8ae5-0c3b517ef2b3\") " pod="openstack/nova-cell0-db-create-vhl9c" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.991860 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kch68\" (UniqueName: \"kubernetes.io/projected/30b4ba23-a361-40f5-9025-a80203afb802-kube-api-access-kch68\") pod \"nova-api-996c-account-create-update-dqbhk\" (UID: \"30b4ba23-a361-40f5-9025-a80203afb802\") " pod="openstack/nova-api-996c-account-create-update-dqbhk" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.991928 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aca08b6d-8b50-4471-8ae5-0c3b517ef2b3-operator-scripts\") pod \"nova-cell0-db-create-vhl9c\" (UID: \"aca08b6d-8b50-4471-8ae5-0c3b517ef2b3\") " pod="openstack/nova-cell0-db-create-vhl9c" Feb 25 16:13:31 crc kubenswrapper[4937]: I0225 16:13:31.992794 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b4ba23-a361-40f5-9025-a80203afb802-operator-scripts\") pod \"nova-api-996c-account-create-update-dqbhk\" (UID: \"30b4ba23-a361-40f5-9025-a80203afb802\") " pod="openstack/nova-api-996c-account-create-update-dqbhk" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.026149 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wglbn\" (UniqueName: \"kubernetes.io/projected/dc7039f2-d86f-4915-92de-5eab8c16f281-kube-api-access-wglbn\") pod \"nova-api-db-create-ck6bq\" (UID: \"dc7039f2-d86f-4915-92de-5eab8c16f281\") " pod="openstack/nova-api-db-create-ck6bq" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.049552 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-gp8qg"] Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.050928 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-gp8qg" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.059990 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kch68\" (UniqueName: \"kubernetes.io/projected/30b4ba23-a361-40f5-9025-a80203afb802-kube-api-access-kch68\") pod \"nova-api-996c-account-create-update-dqbhk\" (UID: \"30b4ba23-a361-40f5-9025-a80203afb802\") " pod="openstack/nova-api-996c-account-create-update-dqbhk" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.072538 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c25c-account-create-update-pjkhj"] Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.074116 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c25c-account-create-update-pjkhj" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.079827 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.092819 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-gp8qg"] Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.094377 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d324c2c-8a6d-431b-92d0-b735158fd9fa-operator-scripts\") pod \"nova-cell1-db-create-gp8qg\" (UID: \"3d324c2c-8a6d-431b-92d0-b735158fd9fa\") " pod="openstack/nova-cell1-db-create-gp8qg" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.094431 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aca08b6d-8b50-4471-8ae5-0c3b517ef2b3-operator-scripts\") pod \"nova-cell0-db-create-vhl9c\" (UID: \"aca08b6d-8b50-4471-8ae5-0c3b517ef2b3\") " pod="openstack/nova-cell0-db-create-vhl9c" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.094526 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5f5588-4ffb-43f5-a891-0a61f46ab7af-operator-scripts\") pod \"nova-cell0-c25c-account-create-update-pjkhj\" (UID: \"0f5f5588-4ffb-43f5-a891-0a61f46ab7af\") " pod="openstack/nova-cell0-c25c-account-create-update-pjkhj" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.094557 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5txdh\" (UniqueName: \"kubernetes.io/projected/0f5f5588-4ffb-43f5-a891-0a61f46ab7af-kube-api-access-5txdh\") pod \"nova-cell0-c25c-account-create-update-pjkhj\" (UID: \"0f5f5588-4ffb-43f5-a891-0a61f46ab7af\") " pod="openstack/nova-cell0-c25c-account-create-update-pjkhj" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.094594 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75cx9\" (UniqueName: \"kubernetes.io/projected/aca08b6d-8b50-4471-8ae5-0c3b517ef2b3-kube-api-access-75cx9\") pod \"nova-cell0-db-create-vhl9c\" (UID: \"aca08b6d-8b50-4471-8ae5-0c3b517ef2b3\") " pod="openstack/nova-cell0-db-create-vhl9c" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.094613 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k89lx\" (UniqueName: 
\"kubernetes.io/projected/3d324c2c-8a6d-431b-92d0-b735158fd9fa-kube-api-access-k89lx\") pod \"nova-cell1-db-create-gp8qg\" (UID: \"3d324c2c-8a6d-431b-92d0-b735158fd9fa\") " pod="openstack/nova-cell1-db-create-gp8qg" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.095825 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aca08b6d-8b50-4471-8ae5-0c3b517ef2b3-operator-scripts\") pod \"nova-cell0-db-create-vhl9c\" (UID: \"aca08b6d-8b50-4471-8ae5-0c3b517ef2b3\") " pod="openstack/nova-cell0-db-create-vhl9c" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.118081 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c25c-account-create-update-pjkhj"] Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.138192 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75cx9\" (UniqueName: \"kubernetes.io/projected/aca08b6d-8b50-4471-8ae5-0c3b517ef2b3-kube-api-access-75cx9\") pod \"nova-cell0-db-create-vhl9c\" (UID: \"aca08b6d-8b50-4471-8ae5-0c3b517ef2b3\") " pod="openstack/nova-cell0-db-create-vhl9c" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.146405 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.197638 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k89lx\" (UniqueName: \"kubernetes.io/projected/3d324c2c-8a6d-431b-92d0-b735158fd9fa-kube-api-access-k89lx\") pod \"nova-cell1-db-create-gp8qg\" (UID: \"3d324c2c-8a6d-431b-92d0-b735158fd9fa\") " pod="openstack/nova-cell1-db-create-gp8qg" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.197741 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d324c2c-8a6d-431b-92d0-b735158fd9fa-operator-scripts\") pod \"nova-cell1-db-create-gp8qg\" (UID: \"3d324c2c-8a6d-431b-92d0-b735158fd9fa\") " pod="openstack/nova-cell1-db-create-gp8qg" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.197834 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5f5588-4ffb-43f5-a891-0a61f46ab7af-operator-scripts\") pod \"nova-cell0-c25c-account-create-update-pjkhj\" (UID: \"0f5f5588-4ffb-43f5-a891-0a61f46ab7af\") " pod="openstack/nova-cell0-c25c-account-create-update-pjkhj" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.197867 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5txdh\" (UniqueName: \"kubernetes.io/projected/0f5f5588-4ffb-43f5-a891-0a61f46ab7af-kube-api-access-5txdh\") pod \"nova-cell0-c25c-account-create-update-pjkhj\" (UID: \"0f5f5588-4ffb-43f5-a891-0a61f46ab7af\") " pod="openstack/nova-cell0-c25c-account-create-update-pjkhj" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.199693 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d324c2c-8a6d-431b-92d0-b735158fd9fa-operator-scripts\") pod \"nova-cell1-db-create-gp8qg\" (UID: \"3d324c2c-8a6d-431b-92d0-b735158fd9fa\") " pod="openstack/nova-cell1-db-create-gp8qg" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.200235 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0f5f5588-4ffb-43f5-a891-0a61f46ab7af-operator-scripts\") pod \"nova-cell0-c25c-account-create-update-pjkhj\" (UID: \"0f5f5588-4ffb-43f5-a891-0a61f46ab7af\") " pod="openstack/nova-cell0-c25c-account-create-update-pjkhj" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.223234 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k89lx\" (UniqueName: \"kubernetes.io/projected/3d324c2c-8a6d-431b-92d0-b735158fd9fa-kube-api-access-k89lx\") pod \"nova-cell1-db-create-gp8qg\" (UID: \"3d324c2c-8a6d-431b-92d0-b735158fd9fa\") " pod="openstack/nova-cell1-db-create-gp8qg" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.233460 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5txdh\" (UniqueName: \"kubernetes.io/projected/0f5f5588-4ffb-43f5-a891-0a61f46ab7af-kube-api-access-5txdh\") pod \"nova-cell0-c25c-account-create-update-pjkhj\" (UID: \"0f5f5588-4ffb-43f5-a891-0a61f46ab7af\") " pod="openstack/nova-cell0-c25c-account-create-update-pjkhj" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.256652 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-dd47-account-create-update-m7rq7"] Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.258230 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.262833 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-dd47-account-create-update-m7rq7"] Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.262994 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.312852 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ck6bq" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.343836 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-996c-account-create-update-dqbhk" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.409796 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zwmh\" (UniqueName: \"kubernetes.io/projected/70916290-1479-4642-b8c3-cd571d51ba42-kube-api-access-8zwmh\") pod \"nova-cell1-dd47-account-create-update-m7rq7\" (UID: \"70916290-1479-4642-b8c3-cd571d51ba42\") " pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.411225 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vhl9c" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.411963 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70916290-1479-4642-b8c3-cd571d51ba42-operator-scripts\") pod \"nova-cell1-dd47-account-create-update-m7rq7\" (UID: \"70916290-1479-4642-b8c3-cd571d51ba42\") " pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.431915 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gp8qg" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.446760 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c25c-account-create-update-pjkhj" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.516296 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zwmh\" (UniqueName: \"kubernetes.io/projected/70916290-1479-4642-b8c3-cd571d51ba42-kube-api-access-8zwmh\") pod \"nova-cell1-dd47-account-create-update-m7rq7\" (UID: \"70916290-1479-4642-b8c3-cd571d51ba42\") " pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.516750 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70916290-1479-4642-b8c3-cd571d51ba42-operator-scripts\") pod \"nova-cell1-dd47-account-create-update-m7rq7\" (UID: \"70916290-1479-4642-b8c3-cd571d51ba42\") " pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.519118 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70916290-1479-4642-b8c3-cd571d51ba42-operator-scripts\") pod \"nova-cell1-dd47-account-create-update-m7rq7\" (UID: \"70916290-1479-4642-b8c3-cd571d51ba42\") " pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.547327 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zwmh\" (UniqueName: \"kubernetes.io/projected/70916290-1479-4642-b8c3-cd571d51ba42-kube-api-access-8zwmh\") pod \"nova-cell1-dd47-account-create-update-m7rq7\" (UID: \"70916290-1479-4642-b8c3-cd571d51ba42\") " pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.553334 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75675bb4d7-q28jd" event={"ID":"c0193db4-c078-4d8c-8437-538da8d426d2","Type":"ContainerStarted","Data":"4479b73094c076aaf313351f6379841e9bf8477f5fe36ae696d1c43b1054393e"} Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.553611 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.553650 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.576356 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"1aaa1053-5b44-458d-aa42-a9804344d2e3","Type":"ContainerStarted","Data":"312d0b6618c2c81c82d5415d1326e0f815b0f6ac2bb96de77fd0afd3090e1701"} Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.576399 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"1aaa1053-5b44-458d-aa42-a9804344d2e3","Type":"ContainerStarted","Data":"8eca9ea59eee3866e4275d6f72f464108f930c75e50e567c6f08c006d00a3e48"} Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.590386 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-75675bb4d7-q28jd" podStartSLOduration=2.59036882 podStartE2EDuration="2.59036882s" podCreationTimestamp="2026-02-25 16:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:32.586867433 +0000 UTC m=+1663.600259323" 
watchObservedRunningTime="2026-02-25 16:13:32.59036882 +0000 UTC m=+1663.603760710" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.608019 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.6080012420000003 podStartE2EDuration="2.608001242s" podCreationTimestamp="2026-02-25 16:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:32.607735966 +0000 UTC m=+1663.621127856" watchObservedRunningTime="2026-02-25 16:13:32.608001242 +0000 UTC m=+1663.621393132" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.756271 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" Feb 25 16:13:32 crc kubenswrapper[4937]: I0225 16:13:32.886170 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ck6bq"] Feb 25 16:13:32 crc kubenswrapper[4937]: W0225 16:13:32.893319 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc7039f2_d86f_4915_92de_5eab8c16f281.slice/crio-a8cf7cf9cf5a2df4e94afc83ec17cb83b56499aef52281fd6070068c1f305201 WatchSource:0}: Error finding container a8cf7cf9cf5a2df4e94afc83ec17cb83b56499aef52281fd6070068c1f305201: Status 404 returned error can't find the container with id a8cf7cf9cf5a2df4e94afc83ec17cb83b56499aef52281fd6070068c1f305201 Feb 25 16:13:33 crc kubenswrapper[4937]: I0225 16:13:33.518833 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vhl9c"] Feb 25 16:13:33 crc kubenswrapper[4937]: I0225 16:13:33.534646 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-gp8qg"] Feb 25 16:13:33 crc kubenswrapper[4937]: W0225 16:13:33.542799 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaca08b6d_8b50_4471_8ae5_0c3b517ef2b3.slice/crio-81b8c405ee85bc4a23a7f597082bdae0f3ddb0b189e75da025f3cfd0841086b4 WatchSource:0}: Error finding container 81b8c405ee85bc4a23a7f597082bdae0f3ddb0b189e75da025f3cfd0841086b4: Status 404 returned error can't find the container with id 81b8c405ee85bc4a23a7f597082bdae0f3ddb0b189e75da025f3cfd0841086b4 Feb 25 16:13:33 crc kubenswrapper[4937]: I0225 16:13:33.556134 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c25c-account-create-update-pjkhj"] Feb 25 16:13:33 crc kubenswrapper[4937]: I0225 16:13:33.565148 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-996c-account-create-update-dqbhk"] Feb 25 16:13:33 crc kubenswrapper[4937]: I0225 16:13:33.614568 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-dd47-account-create-update-m7rq7"] Feb 25 16:13:33 crc kubenswrapper[4937]: I0225 16:13:33.625695 4937 generic.go:334] "Generic (PLEG): container finished" podID="dc7039f2-d86f-4915-92de-5eab8c16f281" containerID="17fe8c26e8a7d51a9c951e2df23ea4327813d2108f24fe58b6c7693b73ddab73" exitCode=0 Feb 25 16:13:33 crc kubenswrapper[4937]: I0225 16:13:33.625785 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ck6bq" event={"ID":"dc7039f2-d86f-4915-92de-5eab8c16f281","Type":"ContainerDied","Data":"17fe8c26e8a7d51a9c951e2df23ea4327813d2108f24fe58b6c7693b73ddab73"} Feb 25 16:13:33 crc kubenswrapper[4937]: 
I0225 16:13:33.625813 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ck6bq" event={"ID":"dc7039f2-d86f-4915-92de-5eab8c16f281","Type":"ContainerStarted","Data":"a8cf7cf9cf5a2df4e94afc83ec17cb83b56499aef52281fd6070068c1f305201"} Feb 25 16:13:33 crc kubenswrapper[4937]: I0225 16:13:33.643379 4937 generic.go:334] "Generic (PLEG): container finished" podID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerID="dc287f1e28ef0946a0a899f78118307dbe945bcabd976b94f3ab313d8b43fd20" exitCode=0 Feb 25 16:13:33 crc kubenswrapper[4937]: I0225 16:13:33.643464 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a2d638a-ffd8-4721-be54-eb6b911ffbf0","Type":"ContainerDied","Data":"dc287f1e28ef0946a0a899f78118307dbe945bcabd976b94f3ab313d8b43fd20"} Feb 25 16:13:33 crc kubenswrapper[4937]: I0225 16:13:33.650866 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gp8qg" event={"ID":"3d324c2c-8a6d-431b-92d0-b735158fd9fa","Type":"ContainerStarted","Data":"c45c9d418601e0d34ccafd96390d8a3dab04bddc18b22100a32eccc5e88abef0"} Feb 25 16:13:33 crc kubenswrapper[4937]: I0225 16:13:33.661755 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vhl9c" event={"ID":"aca08b6d-8b50-4471-8ae5-0c3b517ef2b3","Type":"ContainerStarted","Data":"81b8c405ee85bc4a23a7f597082bdae0f3ddb0b189e75da025f3cfd0841086b4"} Feb 25 16:13:33 crc kubenswrapper[4937]: I0225 16:13:33.666348 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c25c-account-create-update-pjkhj" event={"ID":"0f5f5588-4ffb-43f5-a891-0a61f46ab7af","Type":"ContainerStarted","Data":"d4dee5ed8396e649dfabf8a03d45c2d3599eb8f6dc1306aed701f347331abde0"} Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.126899 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.265311 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-combined-ca-bundle\") pod \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.265403 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rblq\" (UniqueName: \"kubernetes.io/projected/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-kube-api-access-6rblq\") pod \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.265465 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-run-httpd\") pod \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.265554 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-log-httpd\") pod \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.265601 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-scripts\") pod \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.265651 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-sg-core-conf-yaml\") pod \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.265720 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-config-data\") pod \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\" (UID: \"0a2d638a-ffd8-4721-be54-eb6b911ffbf0\") " Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.267196 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0a2d638a-ffd8-4721-be54-eb6b911ffbf0" (UID: "0a2d638a-ffd8-4721-be54-eb6b911ffbf0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.271310 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0a2d638a-ffd8-4721-be54-eb6b911ffbf0" (UID: "0a2d638a-ffd8-4721-be54-eb6b911ffbf0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.277212 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-kube-api-access-6rblq" (OuterVolumeSpecName: "kube-api-access-6rblq") pod "0a2d638a-ffd8-4721-be54-eb6b911ffbf0" (UID: "0a2d638a-ffd8-4721-be54-eb6b911ffbf0"). InnerVolumeSpecName "kube-api-access-6rblq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.292000 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-scripts" (OuterVolumeSpecName: "scripts") pod "0a2d638a-ffd8-4721-be54-eb6b911ffbf0" (UID: "0a2d638a-ffd8-4721-be54-eb6b911ffbf0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.374095 4937 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.374154 4937 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.374190 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.374204 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rblq\" (UniqueName: \"kubernetes.io/projected/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-kube-api-access-6rblq\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.489763 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0a2d638a-ffd8-4721-be54-eb6b911ffbf0" (UID: "0a2d638a-ffd8-4721-be54-eb6b911ffbf0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.579734 4937 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.586335 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a2d638a-ffd8-4721-be54-eb6b911ffbf0" (UID: "0a2d638a-ffd8-4721-be54-eb6b911ffbf0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.670087 4937 scope.go:117] "RemoveContainer" containerID="79448ac60046cd6dbf14cd22e9c5ed8e94fbf29fd6cb045ed5751281c9ff6629" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.678710 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-config-data" (OuterVolumeSpecName: "config-data") pod "0a2d638a-ffd8-4721-be54-eb6b911ffbf0" (UID: "0a2d638a-ffd8-4721-be54-eb6b911ffbf0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.681460 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.681495 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a2d638a-ffd8-4721-be54-eb6b911ffbf0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.699500 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a2d638a-ffd8-4721-be54-eb6b911ffbf0","Type":"ContainerDied","Data":"a13a4896360173233a547fd6c2c57b3891482c6cfb7e6a580e77b7285b50aa5e"} Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.699552 4937 scope.go:117] "RemoveContainer" containerID="fa1d5c93a79360e663af9ea5722c354b0efee7b8d01479c5d48d69f93bb4dc9b" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.699692 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.727240 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gp8qg" event={"ID":"3d324c2c-8a6d-431b-92d0-b735158fd9fa","Type":"ContainerStarted","Data":"df8cb711efe73603c82bd5bba6a3b14feecdc2e93753b56ee494be611bfbbc78"} Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.748664 4937 generic.go:334] "Generic (PLEG): container finished" podID="aca08b6d-8b50-4471-8ae5-0c3b517ef2b3" containerID="f395a5b1e304c3bb44e2b84ed50e0c9ed2ac844e1a470dc98f6a712d508cda10" exitCode=0 Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.748827 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vhl9c" event={"ID":"aca08b6d-8b50-4471-8ae5-0c3b517ef2b3","Type":"ContainerDied","Data":"f395a5b1e304c3bb44e2b84ed50e0c9ed2ac844e1a470dc98f6a712d508cda10"} Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.768961 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" event={"ID":"70916290-1479-4642-b8c3-cd571d51ba42","Type":"ContainerStarted","Data":"77432d56416122942e2324d15dfdcdb0a8a8399a29c6d11a876e4b1f973dbf7d"} Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.768999 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" event={"ID":"70916290-1479-4642-b8c3-cd571d51ba42","Type":"ContainerStarted","Data":"deb6ca490af0f73d8fb08908e5ee94ed97e9ad8a07e6dd26b5d43553177c7295"} Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.805030 4937 generic.go:334] "Generic (PLEG): container finished" podID="0f5f5588-4ffb-43f5-a891-0a61f46ab7af" containerID="db1e85c0e17d1bfd820f6c49914c2c6689c8bc86c7d87db309f01c55431a9af9" exitCode=0 Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.805105 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c25c-account-create-update-pjkhj" event={"ID":"0f5f5588-4ffb-43f5-a891-0a61f46ab7af","Type":"ContainerDied","Data":"db1e85c0e17d1bfd820f6c49914c2c6689c8bc86c7d87db309f01c55431a9af9"} Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.844494 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-996c-account-create-update-dqbhk" event={"ID":"30b4ba23-a361-40f5-9025-a80203afb802","Type":"ContainerStarted","Data":"35f2454e80988e6d3382331eb99b5326ab08f19cd000e35acdf3a10e7b3e808b"} Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.844548 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-996c-account-create-update-dqbhk" event={"ID":"30b4ba23-a361-40f5-9025-a80203afb802","Type":"ContainerStarted","Data":"e96f59d14320c2cfebd88eb939b531da4629b9226d404d4a5a37b0fd4c7bc61d"} Feb 25 16:13:34 crc kubenswrapper[4937]: I0225 16:13:34.894843 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-996c-account-create-update-dqbhk" podStartSLOduration=3.8948256470000002 podStartE2EDuration="3.894825647s" podCreationTimestamp="2026-02-25 16:13:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:34.864844666 +0000 UTC m=+1665.878236556" watchObservedRunningTime="2026-02-25 16:13:34.894825647 +0000 UTC m=+1665.908217537" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.046715 4937 scope.go:117] "RemoveContainer" 
containerID="b755297cd1c0f0146497c9710ee4daaba508f5c278f65f587dba7b3a79f8473e" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.046766 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.082140 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.108812 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:35 crc kubenswrapper[4937]: E0225 16:13:35.109252 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="ceilometer-central-agent" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.109267 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="ceilometer-central-agent" Feb 25 16:13:35 crc kubenswrapper[4937]: E0225 16:13:35.109301 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="ceilometer-notification-agent" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.109307 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="ceilometer-notification-agent" Feb 25 16:13:35 crc kubenswrapper[4937]: E0225 16:13:35.109317 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="sg-core" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.109323 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="sg-core" Feb 25 16:13:35 crc kubenswrapper[4937]: E0225 16:13:35.109330 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="proxy-httpd" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.109335 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="proxy-httpd" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.109535 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="ceilometer-notification-agent" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.109555 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="proxy-httpd" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.109571 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="sg-core" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.109579 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" containerName="ceilometer-central-agent" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.111385 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.114054 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.120328 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.145233 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.150813 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.150866 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438c5ff0-7419-49e9-848e-bbdf17319694-log-httpd\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.150907 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.150966 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v6s6\" (UniqueName: \"kubernetes.io/projected/438c5ff0-7419-49e9-848e-bbdf17319694-kube-api-access-6v6s6\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.150988 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438c5ff0-7419-49e9-848e-bbdf17319694-run-httpd\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.151003 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-config-data\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.151016 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-scripts\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.200738 4937 scope.go:117] "RemoveContainer" containerID="dc287f1e28ef0946a0a899f78118307dbe945bcabd976b94f3ab313d8b43fd20" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.254970 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-scripts\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.255170 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.255226 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438c5ff0-7419-49e9-848e-bbdf17319694-log-httpd\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.255296 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.255361 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v6s6\" (UniqueName: \"kubernetes.io/projected/438c5ff0-7419-49e9-848e-bbdf17319694-kube-api-access-6v6s6\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.255400 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438c5ff0-7419-49e9-848e-bbdf17319694-run-httpd\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.255430 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-config-data\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.256835 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438c5ff0-7419-49e9-848e-bbdf17319694-log-httpd\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.262447 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438c5ff0-7419-49e9-848e-bbdf17319694-run-httpd\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.280573 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-scripts\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.281905 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-config-data\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.284140 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.289569 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.304522 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v6s6\" (UniqueName: \"kubernetes.io/projected/438c5ff0-7419-49e9-848e-bbdf17319694-kube-api-access-6v6s6\") pod \"ceilometer-0\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.342699 4937 scope.go:117] "RemoveContainer" containerID="be08f64bb9075dd47090696dff9b7393cb09a71c381a4fb20ffd8f490a4757b4" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.432048 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.453717 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a2d638a-ffd8-4721-be54-eb6b911ffbf0" path="/var/lib/kubelet/pods/0a2d638a-ffd8-4721-be54-eb6b911ffbf0/volumes" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.632982 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ck6bq" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.780993 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wglbn\" (UniqueName: \"kubernetes.io/projected/dc7039f2-d86f-4915-92de-5eab8c16f281-kube-api-access-wglbn\") pod \"dc7039f2-d86f-4915-92de-5eab8c16f281\" (UID: \"dc7039f2-d86f-4915-92de-5eab8c16f281\") " Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.781570 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc7039f2-d86f-4915-92de-5eab8c16f281-operator-scripts\") pod \"dc7039f2-d86f-4915-92de-5eab8c16f281\" (UID: \"dc7039f2-d86f-4915-92de-5eab8c16f281\") " Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.781866 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc7039f2-d86f-4915-92de-5eab8c16f281-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc7039f2-d86f-4915-92de-5eab8c16f281" (UID: "dc7039f2-d86f-4915-92de-5eab8c16f281"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.782347 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc7039f2-d86f-4915-92de-5eab8c16f281-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.788240 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc7039f2-d86f-4915-92de-5eab8c16f281-kube-api-access-wglbn" (OuterVolumeSpecName: "kube-api-access-wglbn") pod "dc7039f2-d86f-4915-92de-5eab8c16f281" (UID: "dc7039f2-d86f-4915-92de-5eab8c16f281"). InnerVolumeSpecName "kube-api-access-wglbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.874642 4937 generic.go:334] "Generic (PLEG): container finished" podID="3d324c2c-8a6d-431b-92d0-b735158fd9fa" containerID="df8cb711efe73603c82bd5bba6a3b14feecdc2e93753b56ee494be611bfbbc78" exitCode=0 Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.874726 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gp8qg" event={"ID":"3d324c2c-8a6d-431b-92d0-b735158fd9fa","Type":"ContainerDied","Data":"df8cb711efe73603c82bd5bba6a3b14feecdc2e93753b56ee494be611bfbbc78"} Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.889970 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wglbn\" (UniqueName: \"kubernetes.io/projected/dc7039f2-d86f-4915-92de-5eab8c16f281-kube-api-access-wglbn\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.900375 4937 generic.go:334] "Generic (PLEG): container finished" podID="70916290-1479-4642-b8c3-cd571d51ba42" containerID="77432d56416122942e2324d15dfdcdb0a8a8399a29c6d11a876e4b1f973dbf7d" exitCode=0 Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.900721 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" event={"ID":"70916290-1479-4642-b8c3-cd571d51ba42","Type":"ContainerDied","Data":"77432d56416122942e2324d15dfdcdb0a8a8399a29c6d11a876e4b1f973dbf7d"} Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.944775 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ck6bq" event={"ID":"dc7039f2-d86f-4915-92de-5eab8c16f281","Type":"ContainerDied","Data":"a8cf7cf9cf5a2df4e94afc83ec17cb83b56499aef52281fd6070068c1f305201"} Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.944808 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8cf7cf9cf5a2df4e94afc83ec17cb83b56499aef52281fd6070068c1f305201" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.944875 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-ck6bq" Feb 25 16:13:35 crc kubenswrapper[4937]: I0225 16:13:35.996461 4937 generic.go:334] "Generic (PLEG): container finished" podID="30b4ba23-a361-40f5-9025-a80203afb802" containerID="35f2454e80988e6d3382331eb99b5326ab08f19cd000e35acdf3a10e7b3e808b" exitCode=0 Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.002529 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-996c-account-create-update-dqbhk" event={"ID":"30b4ba23-a361-40f5-9025-a80203afb802","Type":"ContainerDied","Data":"35f2454e80988e6d3382331eb99b5326ab08f19cd000e35acdf3a10e7b3e808b"} Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.637165 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.776808 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.821652 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70916290-1479-4642-b8c3-cd571d51ba42-operator-scripts\") pod \"70916290-1479-4642-b8c3-cd571d51ba42\" (UID: \"70916290-1479-4642-b8c3-cd571d51ba42\") " Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.821831 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zwmh\" (UniqueName: \"kubernetes.io/projected/70916290-1479-4642-b8c3-cd571d51ba42-kube-api-access-8zwmh\") pod \"70916290-1479-4642-b8c3-cd571d51ba42\" (UID: \"70916290-1479-4642-b8c3-cd571d51ba42\") " Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.823101 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70916290-1479-4642-b8c3-cd571d51ba42-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "70916290-1479-4642-b8c3-cd571d51ba42" (UID: "70916290-1479-4642-b8c3-cd571d51ba42"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.833212 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70916290-1479-4642-b8c3-cd571d51ba42-kube-api-access-8zwmh" (OuterVolumeSpecName: "kube-api-access-8zwmh") pod "70916290-1479-4642-b8c3-cd571d51ba42" (UID: "70916290-1479-4642-b8c3-cd571d51ba42"). InnerVolumeSpecName "kube-api-access-8zwmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.841600 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-57ff6d8577-ntrmb" Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.919664 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f86fff94d-29bhj"] Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.919939 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f86fff94d-29bhj" podUID="00113672-6314-4271-b571-682e18b9a920" containerName="neutron-httpd" containerID="cri-o://9588a5557ba29747508d1d9d1cd9fa4955e2495838a9f652499423a47557491d" gracePeriod=30 Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.920244 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f86fff94d-29bhj" podUID="00113672-6314-4271-b571-682e18b9a920" containerName="neutron-api" containerID="cri-o://b8afa98b9fa3cd878e6f97dd5d597e90939e2b1f892fd059bb1d43f89758f3f6" gracePeriod=30 Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.924723 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zwmh\" (UniqueName: \"kubernetes.io/projected/70916290-1479-4642-b8c3-cd571d51ba42-kube-api-access-8zwmh\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.924751 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70916290-1479-4642-b8c3-cd571d51ba42-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:36 crc kubenswrapper[4937]: I0225 16:13:36.984093 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c25c-account-create-update-pjkhj" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.007230 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gp8qg" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.014607 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vhl9c" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.073099 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vhl9c" event={"ID":"aca08b6d-8b50-4471-8ae5-0c3b517ef2b3","Type":"ContainerDied","Data":"81b8c405ee85bc4a23a7f597082bdae0f3ddb0b189e75da025f3cfd0841086b4"} Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.073131 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81b8c405ee85bc4a23a7f597082bdae0f3ddb0b189e75da025f3cfd0841086b4" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.073219 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-vhl9c" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.077603 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" event={"ID":"70916290-1479-4642-b8c3-cd571d51ba42","Type":"ContainerDied","Data":"deb6ca490af0f73d8fb08908e5ee94ed97e9ad8a07e6dd26b5d43553177c7295"} Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.077635 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deb6ca490af0f73d8fb08908e5ee94ed97e9ad8a07e6dd26b5d43553177c7295" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.077651 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-dd47-account-create-update-m7rq7" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.080076 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-754k4" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="registry-server" probeResult="failure" output=< Feb 25 16:13:37 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 16:13:37 crc kubenswrapper[4937]: > Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.083763 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c25c-account-create-update-pjkhj" event={"ID":"0f5f5588-4ffb-43f5-a891-0a61f46ab7af","Type":"ContainerDied","Data":"d4dee5ed8396e649dfabf8a03d45c2d3599eb8f6dc1306aed701f347331abde0"} Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.083804 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4dee5ed8396e649dfabf8a03d45c2d3599eb8f6dc1306aed701f347331abde0" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.083924 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c25c-account-create-update-pjkhj" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.101197 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"438c5ff0-7419-49e9-848e-bbdf17319694","Type":"ContainerStarted","Data":"b046cd31a33afa50f931efda73c6947f6129daa72b5bab53dbe2c10bc8a4ada4"} Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.104191 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-gp8qg" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.104926 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gp8qg" event={"ID":"3d324c2c-8a6d-431b-92d0-b735158fd9fa","Type":"ContainerDied","Data":"c45c9d418601e0d34ccafd96390d8a3dab04bddc18b22100a32eccc5e88abef0"} Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.104969 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c45c9d418601e0d34ccafd96390d8a3dab04bddc18b22100a32eccc5e88abef0" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.128889 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75cx9\" (UniqueName: \"kubernetes.io/projected/aca08b6d-8b50-4471-8ae5-0c3b517ef2b3-kube-api-access-75cx9\") pod \"aca08b6d-8b50-4471-8ae5-0c3b517ef2b3\" (UID: \"aca08b6d-8b50-4471-8ae5-0c3b517ef2b3\") " Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.128957 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5txdh\" (UniqueName: \"kubernetes.io/projected/0f5f5588-4ffb-43f5-a891-0a61f46ab7af-kube-api-access-5txdh\") pod \"0f5f5588-4ffb-43f5-a891-0a61f46ab7af\" (UID: \"0f5f5588-4ffb-43f5-a891-0a61f46ab7af\") " Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.129053 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5f5588-4ffb-43f5-a891-0a61f46ab7af-operator-scripts\") pod \"0f5f5588-4ffb-43f5-a891-0a61f46ab7af\" (UID: \"0f5f5588-4ffb-43f5-a891-0a61f46ab7af\") " Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.129235 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k89lx\" (UniqueName: \"kubernetes.io/projected/3d324c2c-8a6d-431b-92d0-b735158fd9fa-kube-api-access-k89lx\") pod \"3d324c2c-8a6d-431b-92d0-b735158fd9fa\" (UID: \"3d324c2c-8a6d-431b-92d0-b735158fd9fa\") " Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.129324 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d324c2c-8a6d-431b-92d0-b735158fd9fa-operator-scripts\") pod \"3d324c2c-8a6d-431b-92d0-b735158fd9fa\" (UID: \"3d324c2c-8a6d-431b-92d0-b735158fd9fa\") " Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.129419 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aca08b6d-8b50-4471-8ae5-0c3b517ef2b3-operator-scripts\") pod \"aca08b6d-8b50-4471-8ae5-0c3b517ef2b3\" (UID: \"aca08b6d-8b50-4471-8ae5-0c3b517ef2b3\") " Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.132987 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aca08b6d-8b50-4471-8ae5-0c3b517ef2b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aca08b6d-8b50-4471-8ae5-0c3b517ef2b3" (UID: "aca08b6d-8b50-4471-8ae5-0c3b517ef2b3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.132980 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5f5588-4ffb-43f5-a891-0a61f46ab7af-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f5f5588-4ffb-43f5-a891-0a61f46ab7af" (UID: "0f5f5588-4ffb-43f5-a891-0a61f46ab7af"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.133469 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d324c2c-8a6d-431b-92d0-b735158fd9fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d324c2c-8a6d-431b-92d0-b735158fd9fa" (UID: "3d324c2c-8a6d-431b-92d0-b735158fd9fa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.144876 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d324c2c-8a6d-431b-92d0-b735158fd9fa-kube-api-access-k89lx" (OuterVolumeSpecName: "kube-api-access-k89lx") pod "3d324c2c-8a6d-431b-92d0-b735158fd9fa" (UID: "3d324c2c-8a6d-431b-92d0-b735158fd9fa"). InnerVolumeSpecName "kube-api-access-k89lx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.153712 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca08b6d-8b50-4471-8ae5-0c3b517ef2b3-kube-api-access-75cx9" (OuterVolumeSpecName: "kube-api-access-75cx9") pod "aca08b6d-8b50-4471-8ae5-0c3b517ef2b3" (UID: "aca08b6d-8b50-4471-8ae5-0c3b517ef2b3"). InnerVolumeSpecName "kube-api-access-75cx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.181744 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f5f5588-4ffb-43f5-a891-0a61f46ab7af-kube-api-access-5txdh" (OuterVolumeSpecName: "kube-api-access-5txdh") pod "0f5f5588-4ffb-43f5-a891-0a61f46ab7af" (UID: "0f5f5588-4ffb-43f5-a891-0a61f46ab7af"). InnerVolumeSpecName "kube-api-access-5txdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.235040 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5txdh\" (UniqueName: \"kubernetes.io/projected/0f5f5588-4ffb-43f5-a891-0a61f46ab7af-kube-api-access-5txdh\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.235083 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75cx9\" (UniqueName: \"kubernetes.io/projected/aca08b6d-8b50-4471-8ae5-0c3b517ef2b3-kube-api-access-75cx9\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.235096 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5f5588-4ffb-43f5-a891-0a61f46ab7af-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.235105 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k89lx\" (UniqueName: \"kubernetes.io/projected/3d324c2c-8a6d-431b-92d0-b735158fd9fa-kube-api-access-k89lx\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.235114 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d324c2c-8a6d-431b-92d0-b735158fd9fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.235122 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aca08b6d-8b50-4471-8ae5-0c3b517ef2b3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.562011 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-996c-account-create-update-dqbhk" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.748627 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kch68\" (UniqueName: \"kubernetes.io/projected/30b4ba23-a361-40f5-9025-a80203afb802-kube-api-access-kch68\") pod \"30b4ba23-a361-40f5-9025-a80203afb802\" (UID: \"30b4ba23-a361-40f5-9025-a80203afb802\") " Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.748922 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b4ba23-a361-40f5-9025-a80203afb802-operator-scripts\") pod \"30b4ba23-a361-40f5-9025-a80203afb802\" (UID: \"30b4ba23-a361-40f5-9025-a80203afb802\") " Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.750328 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b4ba23-a361-40f5-9025-a80203afb802-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30b4ba23-a361-40f5-9025-a80203afb802" (UID: "30b4ba23-a361-40f5-9025-a80203afb802"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.752895 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b4ba23-a361-40f5-9025-a80203afb802-kube-api-access-kch68" (OuterVolumeSpecName: "kube-api-access-kch68") pod "30b4ba23-a361-40f5-9025-a80203afb802" (UID: "30b4ba23-a361-40f5-9025-a80203afb802"). InnerVolumeSpecName "kube-api-access-kch68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.851650 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kch68\" (UniqueName: \"kubernetes.io/projected/30b4ba23-a361-40f5-9025-a80203afb802-kube-api-access-kch68\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:37 crc kubenswrapper[4937]: I0225 16:13:37.851695 4937 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b4ba23-a361-40f5-9025-a80203afb802-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:38 crc kubenswrapper[4937]: I0225 16:13:38.130164 4937 generic.go:334] "Generic (PLEG): container finished" podID="00113672-6314-4271-b571-682e18b9a920" containerID="9588a5557ba29747508d1d9d1cd9fa4955e2495838a9f652499423a47557491d" exitCode=0 Feb 25 16:13:38 crc kubenswrapper[4937]: I0225 16:13:38.130303 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f86fff94d-29bhj" event={"ID":"00113672-6314-4271-b571-682e18b9a920","Type":"ContainerDied","Data":"9588a5557ba29747508d1d9d1cd9fa4955e2495838a9f652499423a47557491d"} Feb 25 16:13:38 crc kubenswrapper[4937]: I0225 16:13:38.134846 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-996c-account-create-update-dqbhk" event={"ID":"30b4ba23-a361-40f5-9025-a80203afb802","Type":"ContainerDied","Data":"e96f59d14320c2cfebd88eb939b531da4629b9226d404d4a5a37b0fd4c7bc61d"} Feb 25 16:13:38 crc kubenswrapper[4937]: I0225 16:13:38.134892 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e96f59d14320c2cfebd88eb939b531da4629b9226d404d4a5a37b0fd4c7bc61d" Feb 25 16:13:38 crc kubenswrapper[4937]: I0225 16:13:38.135000 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-996c-account-create-update-dqbhk" Feb 25 16:13:38 crc kubenswrapper[4937]: I0225 16:13:38.139506 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"438c5ff0-7419-49e9-848e-bbdf17319694","Type":"ContainerStarted","Data":"0dc67156cd194690fa7ce4eb52a640c8a52cff5bbc86265ed5f3e6b26a5ea247"} Feb 25 16:13:39 crc kubenswrapper[4937]: I0225 16:13:39.150776 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"438c5ff0-7419-49e9-848e-bbdf17319694","Type":"ContainerStarted","Data":"cbde4de279afd299e5f9e009f62ba9bd36720a5e8d214198e7957d4808a98920"} Feb 25 16:13:40 crc kubenswrapper[4937]: I0225 16:13:40.489849 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:40 crc kubenswrapper[4937]: I0225 16:13:40.498172 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75675bb4d7-q28jd" Feb 25 16:13:41 crc kubenswrapper[4937]: I0225 16:13:41.177556 4937 generic.go:334] "Generic (PLEG): container finished" podID="00113672-6314-4271-b571-682e18b9a920" containerID="b8afa98b9fa3cd878e6f97dd5d597e90939e2b1f892fd059bb1d43f89758f3f6" exitCode=0 Feb 25 16:13:41 crc kubenswrapper[4937]: I0225 16:13:41.177833 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f86fff94d-29bhj" event={"ID":"00113672-6314-4271-b571-682e18b9a920","Type":"ContainerDied","Data":"b8afa98b9fa3cd878e6f97dd5d597e90939e2b1f892fd059bb1d43f89758f3f6"} Feb 25 16:13:41 crc kubenswrapper[4937]: I0225 16:13:41.179720 4937 generic.go:334] "Generic (PLEG): container finished" podID="da32b76e-0420-4e12-8b28-8a865a41d899" containerID="4ecadf33cfb0137001239dc2fa3d24886888be08c239db6e6a840a35347849c1" exitCode=137 Feb 25 16:13:41 crc kubenswrapper[4937]: I0225 16:13:41.179796 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"da32b76e-0420-4e12-8b28-8a865a41d899","Type":"ContainerDied","Data":"4ecadf33cfb0137001239dc2fa3d24886888be08c239db6e6a840a35347849c1"} Feb 25 16:13:41 crc kubenswrapper[4937]: I0225 16:13:41.188000 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="da32b76e-0420-4e12-8b28-8a865a41d899" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.190:8776/healthcheck\": dial tcp 10.217.0.190:8776: connect: connection refused" Feb 25 16:13:41 crc kubenswrapper[4937]: I0225 16:13:41.495032 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:13:41 crc kubenswrapper[4937]: I0225 16:13:41.495095 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:13:41 crc kubenswrapper[4937]: I0225 16:13:41.495149 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 16:13:41 crc kubenswrapper[4937]: I0225 16:13:41.495920 4937 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 16:13:41 crc kubenswrapper[4937]: I0225 16:13:41.495985 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" gracePeriod=600 Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.192606 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" exitCode=0 Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.192654 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e"} Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.192704 4937 scope.go:117] "RemoveContainer" containerID="710133016a8fda213d788ff3f0a0661f137f661d0c6764233454878cf67045e1" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.894432 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ln9pv"] Feb 25 16:13:42 crc kubenswrapper[4937]: E0225 16:13:42.904464 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca08b6d-8b50-4471-8ae5-0c3b517ef2b3" containerName="mariadb-database-create" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.904628 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca08b6d-8b50-4471-8ae5-0c3b517ef2b3" containerName="mariadb-database-create" Feb 25 16:13:42 crc kubenswrapper[4937]: E0225 16:13:42.904714 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d324c2c-8a6d-431b-92d0-b735158fd9fa" containerName="mariadb-database-create" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.904798 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d324c2c-8a6d-431b-92d0-b735158fd9fa" containerName="mariadb-database-create" Feb 25 16:13:42 crc kubenswrapper[4937]: E0225 16:13:42.904880 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7039f2-d86f-4915-92de-5eab8c16f281" containerName="mariadb-database-create" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.904928 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7039f2-d86f-4915-92de-5eab8c16f281" containerName="mariadb-database-create" Feb 25 16:13:42 crc kubenswrapper[4937]: E0225 16:13:42.905009 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b4ba23-a361-40f5-9025-a80203afb802" containerName="mariadb-account-create-update" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.905056 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b4ba23-a361-40f5-9025-a80203afb802" containerName="mariadb-account-create-update" Feb 25 16:13:42 crc kubenswrapper[4937]: E0225 16:13:42.905116 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70916290-1479-4642-b8c3-cd571d51ba42" 
containerName="mariadb-account-create-update" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.905171 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="70916290-1479-4642-b8c3-cd571d51ba42" containerName="mariadb-account-create-update" Feb 25 16:13:42 crc kubenswrapper[4937]: E0225 16:13:42.905237 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f5f5588-4ffb-43f5-a891-0a61f46ab7af" containerName="mariadb-account-create-update" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.905282 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f5f5588-4ffb-43f5-a891-0a61f46ab7af" containerName="mariadb-account-create-update" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.905920 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d324c2c-8a6d-431b-92d0-b735158fd9fa" containerName="mariadb-database-create" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.906007 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="70916290-1479-4642-b8c3-cd571d51ba42" containerName="mariadb-account-create-update" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.906085 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f5f5588-4ffb-43f5-a891-0a61f46ab7af" containerName="mariadb-account-create-update" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.906137 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b4ba23-a361-40f5-9025-a80203afb802" containerName="mariadb-account-create-update" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.906172 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca08b6d-8b50-4471-8ae5-0c3b517ef2b3" containerName="mariadb-database-create" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.906204 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7039f2-d86f-4915-92de-5eab8c16f281" containerName="mariadb-database-create" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.913060 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.918212 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ln9pv"] Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.919610 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-m7lvs" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.919848 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 25 16:13:42 crc kubenswrapper[4937]: I0225 16:13:42.921407 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 25 16:13:43 crc kubenswrapper[4937]: I0225 16:13:43.063117 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-scripts\") pod \"nova-cell0-conductor-db-sync-ln9pv\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:43 crc kubenswrapper[4937]: I0225 16:13:43.063206 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq7cf\" (UniqueName: \"kubernetes.io/projected/90c2ee96-d6a2-4231-abc4-e9e186375ede-kube-api-access-xq7cf\") pod \"nova-cell0-conductor-db-sync-ln9pv\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:43 crc kubenswrapper[4937]: I0225 16:13:43.063245 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-config-data\") pod \"nova-cell0-conductor-db-sync-ln9pv\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:43 crc kubenswrapper[4937]: I0225 16:13:43.063457 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ln9pv\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:43 crc kubenswrapper[4937]: I0225 16:13:43.165149 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-scripts\") pod \"nova-cell0-conductor-db-sync-ln9pv\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:43 crc kubenswrapper[4937]: I0225 16:13:43.165251 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq7cf\" (UniqueName: \"kubernetes.io/projected/90c2ee96-d6a2-4231-abc4-e9e186375ede-kube-api-access-xq7cf\") pod \"nova-cell0-conductor-db-sync-ln9pv\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:43 crc kubenswrapper[4937]: I0225 16:13:43.165300 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-config-data\") pod \"nova-cell0-conductor-db-sync-ln9pv\" (UID: 
\"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:43 crc kubenswrapper[4937]: I0225 16:13:43.165362 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ln9pv\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:43 crc kubenswrapper[4937]: I0225 16:13:43.172109 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ln9pv\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:43 crc kubenswrapper[4937]: I0225 16:13:43.173953 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-scripts\") pod \"nova-cell0-conductor-db-sync-ln9pv\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:43 crc kubenswrapper[4937]: I0225 16:13:43.185095 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq7cf\" (UniqueName: \"kubernetes.io/projected/90c2ee96-d6a2-4231-abc4-e9e186375ede-kube-api-access-xq7cf\") pod \"nova-cell0-conductor-db-sync-ln9pv\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:43 crc kubenswrapper[4937]: I0225 16:13:43.187761 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-config-data\") pod \"nova-cell0-conductor-db-sync-ln9pv\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:43 crc kubenswrapper[4937]: I0225 16:13:43.233607 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:13:44 crc kubenswrapper[4937]: E0225 16:13:44.528050 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.139858 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.249811 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"da32b76e-0420-4e12-8b28-8a865a41d899","Type":"ContainerDied","Data":"265d2d2be741a72e22cd5d8c6b3fbac8b06cd3541eca37d798ce7a1350aacb3f"} Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.249881 4937 scope.go:117] "RemoveContainer" containerID="4ecadf33cfb0137001239dc2fa3d24886888be08c239db6e6a840a35347849c1" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.249881 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.259600 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:13:45 crc kubenswrapper[4937]: E0225 16:13:45.259884 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.299030 4937 scope.go:117] "RemoveContainer" containerID="1ec406ed044c94889f7711069205d579fbf677ca16fffde51f2ebc94f444aaf1" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.328707 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-config-data-custom\") pod \"da32b76e-0420-4e12-8b28-8a865a41d899\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.328780 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgmzj\" (UniqueName: \"kubernetes.io/projected/da32b76e-0420-4e12-8b28-8a865a41d899-kube-api-access-zgmzj\") pod \"da32b76e-0420-4e12-8b28-8a865a41d899\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.328807 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-scripts\") pod \"da32b76e-0420-4e12-8b28-8a865a41d899\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.328991 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da32b76e-0420-4e12-8b28-8a865a41d899-logs\") pod \"da32b76e-0420-4e12-8b28-8a865a41d899\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.329055 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-combined-ca-bundle\") pod \"da32b76e-0420-4e12-8b28-8a865a41d899\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.329081 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da32b76e-0420-4e12-8b28-8a865a41d899-etc-machine-id\") pod \"da32b76e-0420-4e12-8b28-8a865a41d899\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.329104 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-config-data\") pod \"da32b76e-0420-4e12-8b28-8a865a41d899\" (UID: \"da32b76e-0420-4e12-8b28-8a865a41d899\") " Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.331402 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/da32b76e-0420-4e12-8b28-8a865a41d899-logs" (OuterVolumeSpecName: "logs") pod "da32b76e-0420-4e12-8b28-8a865a41d899" (UID: "da32b76e-0420-4e12-8b28-8a865a41d899"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.333835 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da32b76e-0420-4e12-8b28-8a865a41d899-kube-api-access-zgmzj" (OuterVolumeSpecName: "kube-api-access-zgmzj") pod "da32b76e-0420-4e12-8b28-8a865a41d899" (UID: "da32b76e-0420-4e12-8b28-8a865a41d899"). InnerVolumeSpecName "kube-api-access-zgmzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.334671 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da32b76e-0420-4e12-8b28-8a865a41d899-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "da32b76e-0420-4e12-8b28-8a865a41d899" (UID: "da32b76e-0420-4e12-8b28-8a865a41d899"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.334838 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-scripts" (OuterVolumeSpecName: "scripts") pod "da32b76e-0420-4e12-8b28-8a865a41d899" (UID: "da32b76e-0420-4e12-8b28-8a865a41d899"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.337195 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da32b76e-0420-4e12-8b28-8a865a41d899" (UID: "da32b76e-0420-4e12-8b28-8a865a41d899"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.389097 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da32b76e-0420-4e12-8b28-8a865a41d899" (UID: "da32b76e-0420-4e12-8b28-8a865a41d899"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.418735 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-config-data" (OuterVolumeSpecName: "config-data") pod "da32b76e-0420-4e12-8b28-8a865a41d899" (UID: "da32b76e-0420-4e12-8b28-8a865a41d899"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.431598 4937 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da32b76e-0420-4e12-8b28-8a865a41d899-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.431927 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.431941 4937 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.431954 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgmzj\" (UniqueName: \"kubernetes.io/projected/da32b76e-0420-4e12-8b28-8a865a41d899-kube-api-access-zgmzj\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.431990 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.432002 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da32b76e-0420-4e12-8b28-8a865a41d899-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.432014 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da32b76e-0420-4e12-8b28-8a865a41d899-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.708496 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.722523 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.748038 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.812729 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 25 16:13:45 crc kubenswrapper[4937]: E0225 16:13:45.813207 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da32b76e-0420-4e12-8b28-8a865a41d899" containerName="cinder-api-log" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.813223 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="da32b76e-0420-4e12-8b28-8a865a41d899" containerName="cinder-api-log" Feb 25 16:13:45 crc kubenswrapper[4937]: E0225 16:13:45.813238 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da32b76e-0420-4e12-8b28-8a865a41d899" containerName="cinder-api" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.813246 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="da32b76e-0420-4e12-8b28-8a865a41d899" containerName="cinder-api" Feb 25 16:13:45 crc kubenswrapper[4937]: E0225 16:13:45.813262 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00113672-6314-4271-b571-682e18b9a920" containerName="neutron-httpd" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.813269 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="00113672-6314-4271-b571-682e18b9a920" containerName="neutron-httpd" Feb 25 16:13:45 crc kubenswrapper[4937]: E0225 16:13:45.813286 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00113672-6314-4271-b571-682e18b9a920" containerName="neutron-api" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.813294 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="00113672-6314-4271-b571-682e18b9a920" containerName="neutron-api" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.813511 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="00113672-6314-4271-b571-682e18b9a920" containerName="neutron-httpd" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.813535 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="da32b76e-0420-4e12-8b28-8a865a41d899" containerName="cinder-api-log" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.813544 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="00113672-6314-4271-b571-682e18b9a920" containerName="neutron-api" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.813558 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="da32b76e-0420-4e12-8b28-8a865a41d899" containerName="cinder-api" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.820411 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.824456 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.824709 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.824897 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.827881 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.844327 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ln9pv"] Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.849318 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-ovndb-tls-certs\") pod \"00113672-6314-4271-b571-682e18b9a920\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.849409 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9xvs\" (UniqueName: \"kubernetes.io/projected/00113672-6314-4271-b571-682e18b9a920-kube-api-access-j9xvs\") pod \"00113672-6314-4271-b571-682e18b9a920\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.849591 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-combined-ca-bundle\") pod \"00113672-6314-4271-b571-682e18b9a920\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.849715 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-config\") pod \"00113672-6314-4271-b571-682e18b9a920\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.849754 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-httpd-config\") pod \"00113672-6314-4271-b571-682e18b9a920\" (UID: \"00113672-6314-4271-b571-682e18b9a920\") " Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.861384 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00113672-6314-4271-b571-682e18b9a920-kube-api-access-j9xvs" (OuterVolumeSpecName: "kube-api-access-j9xvs") pod "00113672-6314-4271-b571-682e18b9a920" (UID: "00113672-6314-4271-b571-682e18b9a920"). InnerVolumeSpecName "kube-api-access-j9xvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.874336 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "00113672-6314-4271-b571-682e18b9a920" (UID: "00113672-6314-4271-b571-682e18b9a920"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.926806 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00113672-6314-4271-b571-682e18b9a920" (UID: "00113672-6314-4271-b571-682e18b9a920"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.926840 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-config" (OuterVolumeSpecName: "config") pod "00113672-6314-4271-b571-682e18b9a920" (UID: "00113672-6314-4271-b571-682e18b9a920"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.951916 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.952000 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-config-data-custom\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.952021 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.952049 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.952109 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-config-data\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.952145 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5716a1da-2a42-48cd-96cd-149adb030006-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.952188 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-scripts\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:45 crc 
kubenswrapper[4937]: I0225 16:13:45.952238 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsm55\" (UniqueName: \"kubernetes.io/projected/5716a1da-2a42-48cd-96cd-149adb030006-kube-api-access-hsm55\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.952271 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5716a1da-2a42-48cd-96cd-149adb030006-logs\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.952359 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9xvs\" (UniqueName: \"kubernetes.io/projected/00113672-6314-4271-b571-682e18b9a920-kube-api-access-j9xvs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.952372 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.952383 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.952396 4937 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:45 crc kubenswrapper[4937]: I0225 16:13:45.957476 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "00113672-6314-4271-b571-682e18b9a920" (UID: "00113672-6314-4271-b571-682e18b9a920"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.054101 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.054357 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-config-data-custom\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.054376 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.054397 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.054440 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-config-data\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.054463 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5716a1da-2a42-48cd-96cd-149adb030006-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.054510 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-scripts\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.054544 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsm55\" (UniqueName: \"kubernetes.io/projected/5716a1da-2a42-48cd-96cd-149adb030006-kube-api-access-hsm55\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.054572 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5716a1da-2a42-48cd-96cd-149adb030006-logs\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.054622 4937 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00113672-6314-4271-b571-682e18b9a920-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:46 crc 
kubenswrapper[4937]: I0225 16:13:46.054988 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5716a1da-2a42-48cd-96cd-149adb030006-logs\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.055041 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5716a1da-2a42-48cd-96cd-149adb030006-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.060372 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.060676 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-config-data\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.061875 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.062954 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-config-data-custom\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.063292 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.063855 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5716a1da-2a42-48cd-96cd-149adb030006-scripts\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.079055 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsm55\" (UniqueName: \"kubernetes.io/projected/5716a1da-2a42-48cd-96cd-149adb030006-kube-api-access-hsm55\") pod \"cinder-api-0\" (UID: \"5716a1da-2a42-48cd-96cd-149adb030006\") " pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.229703 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.274435 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"438c5ff0-7419-49e9-848e-bbdf17319694","Type":"ContainerStarted","Data":"404be641ffbe1a4cddb546bf5af038785cf336831297e16051f1319415ccebd7"} Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.278751 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ln9pv" event={"ID":"90c2ee96-d6a2-4231-abc4-e9e186375ede","Type":"ContainerStarted","Data":"8549ccf4eab0082e1926b1f86ac8d108c3ac2e1093c208e5b1841c0ef9afc792"} Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.285996 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f86fff94d-29bhj" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.287303 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f86fff94d-29bhj" event={"ID":"00113672-6314-4271-b571-682e18b9a920","Type":"ContainerDied","Data":"036dc01cd54d6e8d73a02603e36cbf1736890b63d5b1b877422881489a96aacb"} Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.287382 4937 scope.go:117] "RemoveContainer" containerID="9588a5557ba29747508d1d9d1cd9fa4955e2495838a9f652499423a47557491d" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.300501 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a03635e4-24a3-460b-ab0e-e3f677ac95c5","Type":"ContainerStarted","Data":"f2ea1f5c4380a6714e90f5c5238a8bdfc99a3157b3a91ced22b01206c7bf8f66"} Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.334673 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.577629065 podStartE2EDuration="24.334638969s" podCreationTimestamp="2026-02-25 16:13:22 +0000 UTC" firstStartedPulling="2026-02-25 16:13:23.993818791 +0000 UTC m=+1655.007210681" lastFinishedPulling="2026-02-25 16:13:44.750828685 +0000 UTC m=+1675.764220585" observedRunningTime="2026-02-25 16:13:46.32029077 +0000 UTC m=+1677.333682660" watchObservedRunningTime="2026-02-25 16:13:46.334638969 +0000 UTC m=+1677.348030859" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.344228 4937 scope.go:117] "RemoveContainer" containerID="b8afa98b9fa3cd878e6f97dd5d597e90939e2b1f892fd059bb1d43f89758f3f6" Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.357874 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f86fff94d-29bhj"] Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.425672 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f86fff94d-29bhj"] Feb 25 16:13:46 crc kubenswrapper[4937]: I0225 16:13:46.829549 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 25 16:13:46 crc kubenswrapper[4937]: W0225 16:13:46.846962 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5716a1da_2a42_48cd_96cd_149adb030006.slice/crio-5a35e3254b84bb9e98967abf58891127b226a6b7d5d752db653da14de978ce27 WatchSource:0}: Error finding container 5a35e3254b84bb9e98967abf58891127b226a6b7d5d752db653da14de978ce27: Status 404 returned error can't find the container with id 5a35e3254b84bb9e98967abf58891127b226a6b7d5d752db653da14de978ce27 Feb 25 16:13:47 crc kubenswrapper[4937]: I0225 16:13:47.023917 4937 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-754k4" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="registry-server" probeResult="failure" output=< Feb 25 16:13:47 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 16:13:47 crc kubenswrapper[4937]: > Feb 25 16:13:47 crc kubenswrapper[4937]: I0225 16:13:47.317296 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5716a1da-2a42-48cd-96cd-149adb030006","Type":"ContainerStarted","Data":"5a35e3254b84bb9e98967abf58891127b226a6b7d5d752db653da14de978ce27"} Feb 25 16:13:47 crc kubenswrapper[4937]: I0225 16:13:47.382621 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00113672-6314-4271-b571-682e18b9a920" path="/var/lib/kubelet/pods/00113672-6314-4271-b571-682e18b9a920/volumes" Feb 25 16:13:47 crc kubenswrapper[4937]: I0225 16:13:47.384135 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da32b76e-0420-4e12-8b28-8a865a41d899" path="/var/lib/kubelet/pods/da32b76e-0420-4e12-8b28-8a865a41d899/volumes" Feb 25 16:13:48 crc kubenswrapper[4937]: I0225 16:13:48.331900 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5716a1da-2a42-48cd-96cd-149adb030006","Type":"ContainerStarted","Data":"06bab0ce0595a44c7a6f2aee50b739a25d65a65d17ffee68430d50f6a27cbc8b"} Feb 25 16:13:48 crc kubenswrapper[4937]: I0225 16:13:48.335745 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"438c5ff0-7419-49e9-848e-bbdf17319694","Type":"ContainerStarted","Data":"3b094323599b6b9d3cc1f898590dff0fe7a16fb1bcf4cdd15e846f883f7cb5a7"} Feb 25 16:13:48 crc kubenswrapper[4937]: I0225 16:13:48.335939 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 16:13:48 crc kubenswrapper[4937]: I0225 16:13:48.359197 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.743121455 podStartE2EDuration="14.359177342s" podCreationTimestamp="2026-02-25 16:13:34 +0000 UTC" firstStartedPulling="2026-02-25 16:13:36.799278692 +0000 UTC m=+1667.812670582" lastFinishedPulling="2026-02-25 16:13:47.415334579 +0000 UTC m=+1678.428726469" observedRunningTime="2026-02-25 16:13:48.354136256 +0000 UTC m=+1679.367528146" watchObservedRunningTime="2026-02-25 16:13:48.359177342 +0000 UTC m=+1679.372569232" Feb 25 16:13:49 crc kubenswrapper[4937]: I0225 16:13:49.359118 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5716a1da-2a42-48cd-96cd-149adb030006","Type":"ContainerStarted","Data":"aec3e9c454551a02ca8a6d8ef293cbf01e519ffc3acbd892002196153ecb769f"} Feb 25 16:13:49 crc kubenswrapper[4937]: I0225 16:13:49.359884 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 25 16:13:49 crc kubenswrapper[4937]: I0225 16:13:49.391825 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.39181159 podStartE2EDuration="4.39181159s" podCreationTimestamp="2026-02-25 16:13:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:13:49.386870026 +0000 UTC m=+1680.400261916" watchObservedRunningTime="2026-02-25 16:13:49.39181159 +0000 UTC m=+1680.405203480" Feb 25 16:13:52 crc kubenswrapper[4937]: I0225 
16:13:52.341947 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:52 crc kubenswrapper[4937]: I0225 16:13:52.342606 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="ceilometer-central-agent" containerID="cri-o://0dc67156cd194690fa7ce4eb52a640c8a52cff5bbc86265ed5f3e6b26a5ea247" gracePeriod=30 Feb 25 16:13:52 crc kubenswrapper[4937]: I0225 16:13:52.342945 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="proxy-httpd" containerID="cri-o://3b094323599b6b9d3cc1f898590dff0fe7a16fb1bcf4cdd15e846f883f7cb5a7" gracePeriod=30 Feb 25 16:13:52 crc kubenswrapper[4937]: I0225 16:13:52.343143 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="sg-core" containerID="cri-o://404be641ffbe1a4cddb546bf5af038785cf336831297e16051f1319415ccebd7" gracePeriod=30 Feb 25 16:13:52 crc kubenswrapper[4937]: I0225 16:13:52.343198 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="ceilometer-notification-agent" containerID="cri-o://cbde4de279afd299e5f9e009f62ba9bd36720a5e8d214198e7957d4808a98920" gracePeriod=30 Feb 25 16:13:53 crc kubenswrapper[4937]: I0225 16:13:53.414468 4937 generic.go:334] "Generic (PLEG): container finished" podID="438c5ff0-7419-49e9-848e-bbdf17319694" containerID="3b094323599b6b9d3cc1f898590dff0fe7a16fb1bcf4cdd15e846f883f7cb5a7" exitCode=0 Feb 25 16:13:53 crc kubenswrapper[4937]: I0225 16:13:53.414827 4937 generic.go:334] "Generic (PLEG): container finished" podID="438c5ff0-7419-49e9-848e-bbdf17319694" containerID="404be641ffbe1a4cddb546bf5af038785cf336831297e16051f1319415ccebd7" exitCode=2 Feb 25 16:13:53 crc kubenswrapper[4937]: I0225 16:13:53.414839 4937 generic.go:334] "Generic (PLEG): container finished" podID="438c5ff0-7419-49e9-848e-bbdf17319694" containerID="0dc67156cd194690fa7ce4eb52a640c8a52cff5bbc86265ed5f3e6b26a5ea247" exitCode=0 Feb 25 16:13:53 crc kubenswrapper[4937]: I0225 16:13:53.414705 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"438c5ff0-7419-49e9-848e-bbdf17319694","Type":"ContainerDied","Data":"3b094323599b6b9d3cc1f898590dff0fe7a16fb1bcf4cdd15e846f883f7cb5a7"} Feb 25 16:13:53 crc kubenswrapper[4937]: I0225 16:13:53.414875 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"438c5ff0-7419-49e9-848e-bbdf17319694","Type":"ContainerDied","Data":"404be641ffbe1a4cddb546bf5af038785cf336831297e16051f1319415ccebd7"} Feb 25 16:13:53 crc kubenswrapper[4937]: I0225 16:13:53.414889 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"438c5ff0-7419-49e9-848e-bbdf17319694","Type":"ContainerDied","Data":"0dc67156cd194690fa7ce4eb52a640c8a52cff5bbc86265ed5f3e6b26a5ea247"} Feb 25 16:13:53 crc kubenswrapper[4937]: I0225 16:13:53.857804 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:13:53 crc kubenswrapper[4937]: I0225 16:13:53.858309 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fac18368-fe1e-4431-bf59-1c1e613bc0d6" 
containerName="glance-log" containerID="cri-o://62af60d64b8da5ff29d5b9384e40735a85c436fea31e028ac31eff1f534c467f" gracePeriod=30 Feb 25 16:13:53 crc kubenswrapper[4937]: I0225 16:13:53.858410 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fac18368-fe1e-4431-bf59-1c1e613bc0d6" containerName="glance-httpd" containerID="cri-o://8ba797b8b788cb29e8b568c54553617699937dbc48e9377eb47b38e1dacbdf57" gracePeriod=30 Feb 25 16:13:54 crc kubenswrapper[4937]: I0225 16:13:54.439862 4937 generic.go:334] "Generic (PLEG): container finished" podID="fac18368-fe1e-4431-bf59-1c1e613bc0d6" containerID="62af60d64b8da5ff29d5b9384e40735a85c436fea31e028ac31eff1f534c467f" exitCode=143 Feb 25 16:13:54 crc kubenswrapper[4937]: I0225 16:13:54.439917 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fac18368-fe1e-4431-bf59-1c1e613bc0d6","Type":"ContainerDied","Data":"62af60d64b8da5ff29d5b9384e40735a85c436fea31e028ac31eff1f534c467f"} Feb 25 16:13:55 crc kubenswrapper[4937]: I0225 16:13:55.095344 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:13:55 crc kubenswrapper[4937]: I0225 16:13:55.095999 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c021e5b5-9038-4a91-8785-7461a1d3c981" containerName="glance-log" containerID="cri-o://81f89b2c2a68d55c05964191735c5787af8dbacc9320d0fab4fbce2e13c1458b" gracePeriod=30 Feb 25 16:13:55 crc kubenswrapper[4937]: I0225 16:13:55.096070 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c021e5b5-9038-4a91-8785-7461a1d3c981" containerName="glance-httpd" containerID="cri-o://6e1220f05b516229b945ce6ad87ff9617b4e5a496cf0806562e0602335e6f092" gracePeriod=30 Feb 25 16:13:55 crc kubenswrapper[4937]: I0225 16:13:55.474795 4937 generic.go:334] "Generic (PLEG): container finished" podID="c021e5b5-9038-4a91-8785-7461a1d3c981" containerID="81f89b2c2a68d55c05964191735c5787af8dbacc9320d0fab4fbce2e13c1458b" exitCode=143 Feb 25 16:13:55 crc kubenswrapper[4937]: I0225 16:13:55.474848 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c021e5b5-9038-4a91-8785-7461a1d3c981","Type":"ContainerDied","Data":"81f89b2c2a68d55c05964191735c5787af8dbacc9320d0fab4fbce2e13c1458b"} Feb 25 16:13:55 crc kubenswrapper[4937]: I0225 16:13:55.477393 4937 generic.go:334] "Generic (PLEG): container finished" podID="438c5ff0-7419-49e9-848e-bbdf17319694" containerID="cbde4de279afd299e5f9e009f62ba9bd36720a5e8d214198e7957d4808a98920" exitCode=0 Feb 25 16:13:55 crc kubenswrapper[4937]: I0225 16:13:55.477417 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"438c5ff0-7419-49e9-848e-bbdf17319694","Type":"ContainerDied","Data":"cbde4de279afd299e5f9e009f62ba9bd36720a5e8d214198e7957d4808a98920"} Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.034135 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.087203 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.269583 4937 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-operators-754k4"] Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.503085 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ln9pv" event={"ID":"90c2ee96-d6a2-4231-abc4-e9e186375ede","Type":"ContainerStarted","Data":"7f22c4ab562bfd37ac3a88a1518f26c12f915580c0e99b5e1d8cae72324a64de"} Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.701923 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.733193 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ln9pv" podStartSLOduration=4.378519428 podStartE2EDuration="14.733169007s" podCreationTimestamp="2026-02-25 16:13:42 +0000 UTC" firstStartedPulling="2026-02-25 16:13:45.815912525 +0000 UTC m=+1676.829304425" lastFinishedPulling="2026-02-25 16:13:56.170562114 +0000 UTC m=+1687.183954004" observedRunningTime="2026-02-25 16:13:56.539566218 +0000 UTC m=+1687.552958108" watchObservedRunningTime="2026-02-25 16:13:56.733169007 +0000 UTC m=+1687.746560897" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.810963 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-scripts\") pod \"438c5ff0-7419-49e9-848e-bbdf17319694\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.811037 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v6s6\" (UniqueName: \"kubernetes.io/projected/438c5ff0-7419-49e9-848e-bbdf17319694-kube-api-access-6v6s6\") pod \"438c5ff0-7419-49e9-848e-bbdf17319694\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.811061 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-config-data\") pod \"438c5ff0-7419-49e9-848e-bbdf17319694\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.811111 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438c5ff0-7419-49e9-848e-bbdf17319694-log-httpd\") pod \"438c5ff0-7419-49e9-848e-bbdf17319694\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.811182 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438c5ff0-7419-49e9-848e-bbdf17319694-run-httpd\") pod \"438c5ff0-7419-49e9-848e-bbdf17319694\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.811232 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-combined-ca-bundle\") pod \"438c5ff0-7419-49e9-848e-bbdf17319694\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.811261 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-sg-core-conf-yaml\") 
pod \"438c5ff0-7419-49e9-848e-bbdf17319694\" (UID: \"438c5ff0-7419-49e9-848e-bbdf17319694\") " Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.811510 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/438c5ff0-7419-49e9-848e-bbdf17319694-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "438c5ff0-7419-49e9-848e-bbdf17319694" (UID: "438c5ff0-7419-49e9-848e-bbdf17319694"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.811525 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/438c5ff0-7419-49e9-848e-bbdf17319694-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "438c5ff0-7419-49e9-848e-bbdf17319694" (UID: "438c5ff0-7419-49e9-848e-bbdf17319694"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.812258 4937 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438c5ff0-7419-49e9-848e-bbdf17319694-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.812281 4937 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438c5ff0-7419-49e9-848e-bbdf17319694-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.818558 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438c5ff0-7419-49e9-848e-bbdf17319694-kube-api-access-6v6s6" (OuterVolumeSpecName: "kube-api-access-6v6s6") pod "438c5ff0-7419-49e9-848e-bbdf17319694" (UID: "438c5ff0-7419-49e9-848e-bbdf17319694"). InnerVolumeSpecName "kube-api-access-6v6s6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.820626 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-scripts" (OuterVolumeSpecName: "scripts") pod "438c5ff0-7419-49e9-848e-bbdf17319694" (UID: "438c5ff0-7419-49e9-848e-bbdf17319694"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.845607 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "438c5ff0-7419-49e9-848e-bbdf17319694" (UID: "438c5ff0-7419-49e9-848e-bbdf17319694"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.914204 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.914238 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v6s6\" (UniqueName: \"kubernetes.io/projected/438c5ff0-7419-49e9-848e-bbdf17319694-kube-api-access-6v6s6\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.914248 4937 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.916943 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "438c5ff0-7419-49e9-848e-bbdf17319694" (UID: "438c5ff0-7419-49e9-848e-bbdf17319694"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:56 crc kubenswrapper[4937]: I0225 16:13:56.956717 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-config-data" (OuterVolumeSpecName: "config-data") pod "438c5ff0-7419-49e9-848e-bbdf17319694" (UID: "438c5ff0-7419-49e9-848e-bbdf17319694"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.016568 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.016603 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438c5ff0-7419-49e9-848e-bbdf17319694-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.368417 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:13:57 crc kubenswrapper[4937]: E0225 16:13:57.368809 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.527816 4937 generic.go:334] "Generic (PLEG): container finished" podID="fac18368-fe1e-4431-bf59-1c1e613bc0d6" containerID="8ba797b8b788cb29e8b568c54553617699937dbc48e9377eb47b38e1dacbdf57" exitCode=0 Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.528216 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fac18368-fe1e-4431-bf59-1c1e613bc0d6","Type":"ContainerDied","Data":"8ba797b8b788cb29e8b568c54553617699937dbc48e9377eb47b38e1dacbdf57"} Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 
16:13:57.560908 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-754k4" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="registry-server" containerID="cri-o://b1bc4e0f1cd6278c6a9db02c66208f3ed93fc844943a3b5938767508183297ad" gracePeriod=2 Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.561194 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.561940 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"438c5ff0-7419-49e9-848e-bbdf17319694","Type":"ContainerDied","Data":"b046cd31a33afa50f931efda73c6947f6129daa72b5bab53dbe2c10bc8a4ada4"} Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.561971 4937 scope.go:117] "RemoveContainer" containerID="3b094323599b6b9d3cc1f898590dff0fe7a16fb1bcf4cdd15e846f883f7cb5a7" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.586969 4937 scope.go:117] "RemoveContainer" containerID="404be641ffbe1a4cddb546bf5af038785cf336831297e16051f1319415ccebd7" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.596562 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.622128 4937 scope.go:117] "RemoveContainer" containerID="cbde4de279afd299e5f9e009f62ba9bd36720a5e8d214198e7957d4808a98920" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.633598 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.649595 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:57 crc kubenswrapper[4937]: E0225 16:13:57.650049 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="ceilometer-central-agent" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.650068 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="ceilometer-central-agent" Feb 25 16:13:57 crc kubenswrapper[4937]: E0225 16:13:57.650106 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="proxy-httpd" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.650113 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="proxy-httpd" Feb 25 16:13:57 crc kubenswrapper[4937]: E0225 16:13:57.650123 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="sg-core" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.650130 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="sg-core" Feb 25 16:13:57 crc kubenswrapper[4937]: E0225 16:13:57.650147 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="ceilometer-notification-agent" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.650154 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="ceilometer-notification-agent" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.650330 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" 
containerName="proxy-httpd" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.650343 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="ceilometer-notification-agent" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.650365 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="ceilometer-central-agent" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.650378 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" containerName="sg-core" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.652179 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.652291 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.653428 4937 scope.go:117] "RemoveContainer" containerID="0dc67156cd194690fa7ce4eb52a640c8a52cff5bbc86265ed5f3e6b26a5ea247" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.658717 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.658824 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.740958 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzflf\" (UniqueName: \"kubernetes.io/projected/197a7c87-0333-4170-b28b-df07e57347fc-kube-api-access-gzflf\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.741020 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/197a7c87-0333-4170-b28b-df07e57347fc-log-httpd\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.741189 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-scripts\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.741421 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.741522 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-config-data\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.741679 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.741714 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/197a7c87-0333-4170-b28b-df07e57347fc-run-httpd\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.796256 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.843284 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-public-tls-certs\") pod \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.843670 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.843724 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-config-data\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.843808 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.843843 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/197a7c87-0333-4170-b28b-df07e57347fc-run-httpd\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.843921 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzflf\" (UniqueName: \"kubernetes.io/projected/197a7c87-0333-4170-b28b-df07e57347fc-kube-api-access-gzflf\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.843980 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/197a7c87-0333-4170-b28b-df07e57347fc-log-httpd\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.844026 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-scripts\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " 
pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.845673 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/197a7c87-0333-4170-b28b-df07e57347fc-run-httpd\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.847328 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/197a7c87-0333-4170-b28b-df07e57347fc-log-httpd\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.852002 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.856355 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-scripts\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.870102 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-config-data\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.881289 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.893223 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzflf\" (UniqueName: \"kubernetes.io/projected/197a7c87-0333-4170-b28b-df07e57347fc-kube-api-access-gzflf\") pod \"ceilometer-0\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " pod="openstack/ceilometer-0" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.948870 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fac18368-fe1e-4431-bf59-1c1e613bc0d6-httpd-run\") pod \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.949900 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.950219 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjgzz\" (UniqueName: \"kubernetes.io/projected/fac18368-fe1e-4431-bf59-1c1e613bc0d6-kube-api-access-cjgzz\") pod \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " Feb 25 16:13:57 crc 
kubenswrapper[4937]: I0225 16:13:57.950259 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fac18368-fe1e-4431-bf59-1c1e613bc0d6-logs\") pod \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.950297 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-scripts\") pod \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.950328 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-combined-ca-bundle\") pod \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.950355 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-config-data\") pod \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\" (UID: \"fac18368-fe1e-4431-bf59-1c1e613bc0d6\") " Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.949771 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fac18368-fe1e-4431-bf59-1c1e613bc0d6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fac18368-fe1e-4431-bf59-1c1e613bc0d6" (UID: "fac18368-fe1e-4431-bf59-1c1e613bc0d6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.950924 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fac18368-fe1e-4431-bf59-1c1e613bc0d6-logs" (OuterVolumeSpecName: "logs") pod "fac18368-fe1e-4431-bf59-1c1e613bc0d6" (UID: "fac18368-fe1e-4431-bf59-1c1e613bc0d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.962696 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-scripts" (OuterVolumeSpecName: "scripts") pod "fac18368-fe1e-4431-bf59-1c1e613bc0d6" (UID: "fac18368-fe1e-4431-bf59-1c1e613bc0d6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.965370 4937 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fac18368-fe1e-4431-bf59-1c1e613bc0d6-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.965394 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fac18368-fe1e-4431-bf59-1c1e613bc0d6-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.965405 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:57 crc kubenswrapper[4937]: I0225 16:13:57.994727 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac18368-fe1e-4431-bf59-1c1e613bc0d6-kube-api-access-cjgzz" (OuterVolumeSpecName: "kube-api-access-cjgzz") pod "fac18368-fe1e-4431-bf59-1c1e613bc0d6" (UID: "fac18368-fe1e-4431-bf59-1c1e613bc0d6"). InnerVolumeSpecName "kube-api-access-cjgzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.011006 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fac18368-fe1e-4431-bf59-1c1e613bc0d6" (UID: "fac18368-fe1e-4431-bf59-1c1e613bc0d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.043706 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-config-data" (OuterVolumeSpecName: "config-data") pod "fac18368-fe1e-4431-bf59-1c1e613bc0d6" (UID: "fac18368-fe1e-4431-bf59-1c1e613bc0d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.065650 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fac18368-fe1e-4431-bf59-1c1e613bc0d6" (UID: "fac18368-fe1e-4431-bf59-1c1e613bc0d6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.067613 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjgzz\" (UniqueName: \"kubernetes.io/projected/fac18368-fe1e-4431-bf59-1c1e613bc0d6-kube-api-access-cjgzz\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.067648 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.067662 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.067674 4937 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fac18368-fe1e-4431-bf59-1c1e613bc0d6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.096031 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.133898 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.272061 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82cf475b-cc29-4d90-a1c4-73e0170f0f48-catalog-content\") pod \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\" (UID: \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\") " Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.272589 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82cf475b-cc29-4d90-a1c4-73e0170f0f48-utilities\") pod \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\" (UID: \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\") " Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.272714 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztnnx\" (UniqueName: \"kubernetes.io/projected/82cf475b-cc29-4d90-a1c4-73e0170f0f48-kube-api-access-ztnnx\") pod \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\" (UID: \"82cf475b-cc29-4d90-a1c4-73e0170f0f48\") " Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.273911 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6" (OuterVolumeSpecName: "glance") pod "fac18368-fe1e-4431-bf59-1c1e613bc0d6" (UID: "fac18368-fe1e-4431-bf59-1c1e613bc0d6"). InnerVolumeSpecName "pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.275052 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82cf475b-cc29-4d90-a1c4-73e0170f0f48-utilities" (OuterVolumeSpecName: "utilities") pod "82cf475b-cc29-4d90-a1c4-73e0170f0f48" (UID: "82cf475b-cc29-4d90-a1c4-73e0170f0f48"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.281691 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82cf475b-cc29-4d90-a1c4-73e0170f0f48-kube-api-access-ztnnx" (OuterVolumeSpecName: "kube-api-access-ztnnx") pod "82cf475b-cc29-4d90-a1c4-73e0170f0f48" (UID: "82cf475b-cc29-4d90-a1c4-73e0170f0f48"). InnerVolumeSpecName "kube-api-access-ztnnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.375773 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82cf475b-cc29-4d90-a1c4-73e0170f0f48-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.376149 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztnnx\" (UniqueName: \"kubernetes.io/projected/82cf475b-cc29-4d90-a1c4-73e0170f0f48-kube-api-access-ztnnx\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.376225 4937 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") on node \"crc\" " Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.432367 4937 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.432912 4937 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6") on node "crc" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.437618 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82cf475b-cc29-4d90-a1c4-73e0170f0f48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82cf475b-cc29-4d90-a1c4-73e0170f0f48" (UID: "82cf475b-cc29-4d90-a1c4-73e0170f0f48"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.485550 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82cf475b-cc29-4d90-a1c4-73e0170f0f48-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.485584 4937 reconciler_common.go:293] "Volume detached for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.580749 4937 generic.go:334] "Generic (PLEG): container finished" podID="c021e5b5-9038-4a91-8785-7461a1d3c981" containerID="6e1220f05b516229b945ce6ad87ff9617b4e5a496cf0806562e0602335e6f092" exitCode=0 Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.581726 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c021e5b5-9038-4a91-8785-7461a1d3c981","Type":"ContainerDied","Data":"6e1220f05b516229b945ce6ad87ff9617b4e5a496cf0806562e0602335e6f092"} Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.584324 4937 generic.go:334] "Generic (PLEG): container finished" podID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerID="b1bc4e0f1cd6278c6a9db02c66208f3ed93fc844943a3b5938767508183297ad" exitCode=0 Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.584509 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-754k4" event={"ID":"82cf475b-cc29-4d90-a1c4-73e0170f0f48","Type":"ContainerDied","Data":"b1bc4e0f1cd6278c6a9db02c66208f3ed93fc844943a3b5938767508183297ad"} Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.584630 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-754k4" event={"ID":"82cf475b-cc29-4d90-a1c4-73e0170f0f48","Type":"ContainerDied","Data":"1ed260763928b99b6d2133ce3f4b71d78aa05aa1ba38e1c597ef7b2c8493166e"} Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.584697 4937 scope.go:117] "RemoveContainer" containerID="b1bc4e0f1cd6278c6a9db02c66208f3ed93fc844943a3b5938767508183297ad" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.584899 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-754k4" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.604676 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.604821 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fac18368-fe1e-4431-bf59-1c1e613bc0d6","Type":"ContainerDied","Data":"ea8f093f290906ca0d673b66786333b2793cfe8a4754f9c0cd6c31226fc0e749"} Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.632388 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-754k4"] Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.662865 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-754k4"] Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.682158 4937 scope.go:117] "RemoveContainer" containerID="fd138b73791dbd412c40686640180fa4e8e5a765bff1a7628fbfdfa25ff4a202" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.685706 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.709335 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.727029 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:13:58 crc kubenswrapper[4937]: E0225 16:13:58.727472 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac18368-fe1e-4431-bf59-1c1e613bc0d6" containerName="glance-log" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.727486 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac18368-fe1e-4431-bf59-1c1e613bc0d6" containerName="glance-log" Feb 25 16:13:58 crc kubenswrapper[4937]: E0225 16:13:58.741633 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="registry-server" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.741668 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="registry-server" Feb 25 16:13:58 crc kubenswrapper[4937]: E0225 16:13:58.741690 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac18368-fe1e-4431-bf59-1c1e613bc0d6" containerName="glance-httpd" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.741697 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac18368-fe1e-4431-bf59-1c1e613bc0d6" containerName="glance-httpd" Feb 25 16:13:58 crc kubenswrapper[4937]: E0225 16:13:58.741710 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="extract-utilities" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.741716 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="extract-utilities" Feb 25 16:13:58 crc kubenswrapper[4937]: E0225 16:13:58.741766 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="extract-content" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.741772 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="extract-content" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.742098 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac18368-fe1e-4431-bf59-1c1e613bc0d6" 
containerName="glance-log" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.742115 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" containerName="registry-server" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.742122 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac18368-fe1e-4431-bf59-1c1e613bc0d6" containerName="glance-httpd" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.743211 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.755231 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.755573 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.758637 4937 scope.go:117] "RemoveContainer" containerID="5f4ed79ccc2b234a018f87b53b79e7cc63f7de11f0f0c3c27230bd9a989aae2c" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.760488 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.834592 4937 scope.go:117] "RemoveContainer" containerID="b1bc4e0f1cd6278c6a9db02c66208f3ed93fc844943a3b5938767508183297ad" Feb 25 16:13:58 crc kubenswrapper[4937]: E0225 16:13:58.836973 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1bc4e0f1cd6278c6a9db02c66208f3ed93fc844943a3b5938767508183297ad\": container with ID starting with b1bc4e0f1cd6278c6a9db02c66208f3ed93fc844943a3b5938767508183297ad not found: ID does not exist" containerID="b1bc4e0f1cd6278c6a9db02c66208f3ed93fc844943a3b5938767508183297ad" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.837042 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1bc4e0f1cd6278c6a9db02c66208f3ed93fc844943a3b5938767508183297ad"} err="failed to get container status \"b1bc4e0f1cd6278c6a9db02c66208f3ed93fc844943a3b5938767508183297ad\": rpc error: code = NotFound desc = could not find container \"b1bc4e0f1cd6278c6a9db02c66208f3ed93fc844943a3b5938767508183297ad\": container with ID starting with b1bc4e0f1cd6278c6a9db02c66208f3ed93fc844943a3b5938767508183297ad not found: ID does not exist" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.837075 4937 scope.go:117] "RemoveContainer" containerID="fd138b73791dbd412c40686640180fa4e8e5a765bff1a7628fbfdfa25ff4a202" Feb 25 16:13:58 crc kubenswrapper[4937]: E0225 16:13:58.841536 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd138b73791dbd412c40686640180fa4e8e5a765bff1a7628fbfdfa25ff4a202\": container with ID starting with fd138b73791dbd412c40686640180fa4e8e5a765bff1a7628fbfdfa25ff4a202 not found: ID does not exist" containerID="fd138b73791dbd412c40686640180fa4e8e5a765bff1a7628fbfdfa25ff4a202" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.841698 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd138b73791dbd412c40686640180fa4e8e5a765bff1a7628fbfdfa25ff4a202"} err="failed to get container status \"fd138b73791dbd412c40686640180fa4e8e5a765bff1a7628fbfdfa25ff4a202\": rpc error: code = 
NotFound desc = could not find container \"fd138b73791dbd412c40686640180fa4e8e5a765bff1a7628fbfdfa25ff4a202\": container with ID starting with fd138b73791dbd412c40686640180fa4e8e5a765bff1a7628fbfdfa25ff4a202 not found: ID does not exist" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.841793 4937 scope.go:117] "RemoveContainer" containerID="5f4ed79ccc2b234a018f87b53b79e7cc63f7de11f0f0c3c27230bd9a989aae2c" Feb 25 16:13:58 crc kubenswrapper[4937]: E0225 16:13:58.844409 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f4ed79ccc2b234a018f87b53b79e7cc63f7de11f0f0c3c27230bd9a989aae2c\": container with ID starting with 5f4ed79ccc2b234a018f87b53b79e7cc63f7de11f0f0c3c27230bd9a989aae2c not found: ID does not exist" containerID="5f4ed79ccc2b234a018f87b53b79e7cc63f7de11f0f0c3c27230bd9a989aae2c" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.844513 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f4ed79ccc2b234a018f87b53b79e7cc63f7de11f0f0c3c27230bd9a989aae2c"} err="failed to get container status \"5f4ed79ccc2b234a018f87b53b79e7cc63f7de11f0f0c3c27230bd9a989aae2c\": rpc error: code = NotFound desc = could not find container \"5f4ed79ccc2b234a018f87b53b79e7cc63f7de11f0f0c3c27230bd9a989aae2c\": container with ID starting with 5f4ed79ccc2b234a018f87b53b79e7cc63f7de11f0f0c3c27230bd9a989aae2c not found: ID does not exist" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.844582 4937 scope.go:117] "RemoveContainer" containerID="8ba797b8b788cb29e8b568c54553617699937dbc48e9377eb47b38e1dacbdf57" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.867719 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.898925 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c50d4693-04e4-40a4-a07d-9475ce9b0125-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.898968 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwbnk\" (UniqueName: \"kubernetes.io/projected/c50d4693-04e4-40a4-a07d-9475ce9b0125-kube-api-access-bwbnk\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.898991 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c50d4693-04e4-40a4-a07d-9475ce9b0125-config-data\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.899029 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 
16:13:58.899109 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c50d4693-04e4-40a4-a07d-9475ce9b0125-logs\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.899162 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c50d4693-04e4-40a4-a07d-9475ce9b0125-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.899186 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c50d4693-04e4-40a4-a07d-9475ce9b0125-scripts\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.899219 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c50d4693-04e4-40a4-a07d-9475ce9b0125-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:58 crc kubenswrapper[4937]: I0225 16:13:58.923776 4937 scope.go:117] "RemoveContainer" containerID="62af60d64b8da5ff29d5b9384e40735a85c436fea31e028ac31eff1f534c467f" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.002527 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c50d4693-04e4-40a4-a07d-9475ce9b0125-logs\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.002611 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c50d4693-04e4-40a4-a07d-9475ce9b0125-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.002640 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c50d4693-04e4-40a4-a07d-9475ce9b0125-scripts\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.002664 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c50d4693-04e4-40a4-a07d-9475ce9b0125-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.002706 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c50d4693-04e4-40a4-a07d-9475ce9b0125-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.002732 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwbnk\" (UniqueName: \"kubernetes.io/projected/c50d4693-04e4-40a4-a07d-9475ce9b0125-kube-api-access-bwbnk\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.002750 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c50d4693-04e4-40a4-a07d-9475ce9b0125-config-data\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.002811 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.003215 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c50d4693-04e4-40a4-a07d-9475ce9b0125-logs\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.003523 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c50d4693-04e4-40a4-a07d-9475ce9b0125-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.009304 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c50d4693-04e4-40a4-a07d-9475ce9b0125-scripts\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.009585 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c50d4693-04e4-40a4-a07d-9475ce9b0125-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.010297 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c50d4693-04e4-40a4-a07d-9475ce9b0125-config-data\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.010771 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.010809 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/75804d914e821b5e0a9ece92cf1b2b0f3da08753d3294f4ad4199fba37b189f7/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.012450 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c50d4693-04e4-40a4-a07d-9475ce9b0125-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.020969 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwbnk\" (UniqueName: \"kubernetes.io/projected/c50d4693-04e4-40a4-a07d-9475ce9b0125-kube-api-access-bwbnk\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.088711 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6b3e2a0-8811-48b7-a981-73b4a16d59d6\") pod \"glance-default-external-api-0\" (UID: \"c50d4693-04e4-40a4-a07d-9475ce9b0125\") " pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.115391 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.136633 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.310114 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-config-data\") pod \"c021e5b5-9038-4a91-8785-7461a1d3c981\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.310178 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kz8v\" (UniqueName: \"kubernetes.io/projected/c021e5b5-9038-4a91-8785-7461a1d3c981-kube-api-access-6kz8v\") pod \"c021e5b5-9038-4a91-8785-7461a1d3c981\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.310294 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-combined-ca-bundle\") pod \"c021e5b5-9038-4a91-8785-7461a1d3c981\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.310369 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-internal-tls-certs\") pod \"c021e5b5-9038-4a91-8785-7461a1d3c981\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.310435 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-scripts\") pod \"c021e5b5-9038-4a91-8785-7461a1d3c981\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.310636 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"c021e5b5-9038-4a91-8785-7461a1d3c981\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.310677 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c021e5b5-9038-4a91-8785-7461a1d3c981-httpd-run\") pod \"c021e5b5-9038-4a91-8785-7461a1d3c981\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.310727 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c021e5b5-9038-4a91-8785-7461a1d3c981-logs\") pod \"c021e5b5-9038-4a91-8785-7461a1d3c981\" (UID: \"c021e5b5-9038-4a91-8785-7461a1d3c981\") " Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.323787 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c021e5b5-9038-4a91-8785-7461a1d3c981-logs" (OuterVolumeSpecName: "logs") pod "c021e5b5-9038-4a91-8785-7461a1d3c981" (UID: "c021e5b5-9038-4a91-8785-7461a1d3c981"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.324299 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c021e5b5-9038-4a91-8785-7461a1d3c981-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.331675 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c021e5b5-9038-4a91-8785-7461a1d3c981-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c021e5b5-9038-4a91-8785-7461a1d3c981" (UID: "c021e5b5-9038-4a91-8785-7461a1d3c981"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.334550 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c021e5b5-9038-4a91-8785-7461a1d3c981-kube-api-access-6kz8v" (OuterVolumeSpecName: "kube-api-access-6kz8v") pod "c021e5b5-9038-4a91-8785-7461a1d3c981" (UID: "c021e5b5-9038-4a91-8785-7461a1d3c981"). InnerVolumeSpecName "kube-api-access-6kz8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.343706 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-scripts" (OuterVolumeSpecName: "scripts") pod "c021e5b5-9038-4a91-8785-7461a1d3c981" (UID: "c021e5b5-9038-4a91-8785-7461a1d3c981"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.396036 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c021e5b5-9038-4a91-8785-7461a1d3c981" (UID: "c021e5b5-9038-4a91-8785-7461a1d3c981"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.398890 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="438c5ff0-7419-49e9-848e-bbdf17319694" path="/var/lib/kubelet/pods/438c5ff0-7419-49e9-848e-bbdf17319694/volumes" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.402188 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82cf475b-cc29-4d90-a1c4-73e0170f0f48" path="/var/lib/kubelet/pods/82cf475b-cc29-4d90-a1c4-73e0170f0f48/volumes" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.403531 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fac18368-fe1e-4431-bf59-1c1e613bc0d6" path="/var/lib/kubelet/pods/fac18368-fe1e-4431-bf59-1c1e613bc0d6/volumes" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.418827 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582" (OuterVolumeSpecName: "glance") pod "c021e5b5-9038-4a91-8785-7461a1d3c981" (UID: "c021e5b5-9038-4a91-8785-7461a1d3c981"). InnerVolumeSpecName "pvc-a6cbd030-8970-4e53-96ea-5336de37e582". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.426012 4937 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.426057 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.426090 4937 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") on node \"crc\" " Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.426106 4937 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c021e5b5-9038-4a91-8785-7461a1d3c981-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.426119 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kz8v\" (UniqueName: \"kubernetes.io/projected/c021e5b5-9038-4a91-8785-7461a1d3c981-kube-api-access-6kz8v\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.447815 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-config-data" (OuterVolumeSpecName: "config-data") pod "c021e5b5-9038-4a91-8785-7461a1d3c981" (UID: "c021e5b5-9038-4a91-8785-7461a1d3c981"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.457729 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c021e5b5-9038-4a91-8785-7461a1d3c981" (UID: "c021e5b5-9038-4a91-8785-7461a1d3c981"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.527980 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.528017 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c021e5b5-9038-4a91-8785-7461a1d3c981-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.541712 4937 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.542028 4937 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a6cbd030-8970-4e53-96ea-5336de37e582" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582") on node "crc" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.595986 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.641852 4937 reconciler_common.go:293] "Volume detached for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") on node \"crc\" DevicePath \"\"" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.652972 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.653572 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c021e5b5-9038-4a91-8785-7461a1d3c981","Type":"ContainerDied","Data":"7f79aa15a8ee531ad3e279dbe486515d32169aa55bedb346ce0268c65dea4a2f"} Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.653644 4937 scope.go:117] "RemoveContainer" containerID="6e1220f05b516229b945ce6ad87ff9617b4e5a496cf0806562e0602335e6f092" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.674616 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"197a7c87-0333-4170-b28b-df07e57347fc","Type":"ContainerStarted","Data":"a652955ada0e39b93df2f015ac2eadffef3f3543e7b500198fee1dd99115bd81"} Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.697090 4937 scope.go:117] "RemoveContainer" containerID="81f89b2c2a68d55c05964191735c5787af8dbacc9320d0fab4fbce2e13c1458b" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.736617 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.784076 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.827587 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:13:59 crc kubenswrapper[4937]: E0225 16:13:59.828045 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c021e5b5-9038-4a91-8785-7461a1d3c981" containerName="glance-log" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.828062 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c021e5b5-9038-4a91-8785-7461a1d3c981" containerName="glance-log" Feb 25 16:13:59 crc kubenswrapper[4937]: E0225 16:13:59.828100 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c021e5b5-9038-4a91-8785-7461a1d3c981" containerName="glance-httpd" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.828106 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c021e5b5-9038-4a91-8785-7461a1d3c981" containerName="glance-httpd" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.828282 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="c021e5b5-9038-4a91-8785-7461a1d3c981" containerName="glance-httpd" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.828311 4937 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c021e5b5-9038-4a91-8785-7461a1d3c981" containerName="glance-log" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.829437 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.833036 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.833208 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.840585 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.930277 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.961447 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55cf40d2-3819-46a7-b9c1-aad7f3a65542-logs\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.961581 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cf40d2-3819-46a7-b9c1-aad7f3a65542-scripts\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.961839 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.961985 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cf40d2-3819-46a7-b9c1-aad7f3a65542-config-data\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.962129 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55cf40d2-3819-46a7-b9c1-aad7f3a65542-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.962173 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ptpv\" (UniqueName: \"kubernetes.io/projected/55cf40d2-3819-46a7-b9c1-aad7f3a65542-kube-api-access-6ptpv\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.962220 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cf40d2-3819-46a7-b9c1-aad7f3a65542-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:13:59 crc kubenswrapper[4937]: I0225 16:13:59.962264 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55cf40d2-3819-46a7-b9c1-aad7f3a65542-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.066358 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55cf40d2-3819-46a7-b9c1-aad7f3a65542-logs\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.066417 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cf40d2-3819-46a7-b9c1-aad7f3a65542-scripts\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.066912 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55cf40d2-3819-46a7-b9c1-aad7f3a65542-logs\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.066603 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.067333 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cf40d2-3819-46a7-b9c1-aad7f3a65542-config-data\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.067453 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55cf40d2-3819-46a7-b9c1-aad7f3a65542-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.067485 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ptpv\" (UniqueName: \"kubernetes.io/projected/55cf40d2-3819-46a7-b9c1-aad7f3a65542-kube-api-access-6ptpv\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.067536 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cf40d2-3819-46a7-b9c1-aad7f3a65542-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.067560 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55cf40d2-3819-46a7-b9c1-aad7f3a65542-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.067795 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55cf40d2-3819-46a7-b9c1-aad7f3a65542-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.071049 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.071309 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/53a683be7e1f0cfe8980b8900c6ef26fb8068fb3b32445402dc99b2c4e60848d/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.076125 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cf40d2-3819-46a7-b9c1-aad7f3a65542-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.080360 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55cf40d2-3819-46a7-b9c1-aad7f3a65542-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.080908 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cf40d2-3819-46a7-b9c1-aad7f3a65542-scripts\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.084130 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cf40d2-3819-46a7-b9c1-aad7f3a65542-config-data\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.093325 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ptpv\" (UniqueName: 
\"kubernetes.io/projected/55cf40d2-3819-46a7-b9c1-aad7f3a65542-kube-api-access-6ptpv\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.121784 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a6cbd030-8970-4e53-96ea-5336de37e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6cbd030-8970-4e53-96ea-5336de37e582\") pod \"glance-default-internal-api-0\" (UID: \"55cf40d2-3819-46a7-b9c1-aad7f3a65542\") " pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.144506 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533934-9znz5"] Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.145943 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533934-9znz5" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.148049 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.148201 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.148345 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.153443 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533934-9znz5"] Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.167089 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.270437 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbrpw\" (UniqueName: \"kubernetes.io/projected/b10bad36-94f5-475c-9e2d-c99fcac85f6a-kube-api-access-vbrpw\") pod \"auto-csr-approver-29533934-9znz5\" (UID: \"b10bad36-94f5-475c-9e2d-c99fcac85f6a\") " pod="openshift-infra/auto-csr-approver-29533934-9znz5" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.376104 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbrpw\" (UniqueName: \"kubernetes.io/projected/b10bad36-94f5-475c-9e2d-c99fcac85f6a-kube-api-access-vbrpw\") pod \"auto-csr-approver-29533934-9znz5\" (UID: \"b10bad36-94f5-475c-9e2d-c99fcac85f6a\") " pod="openshift-infra/auto-csr-approver-29533934-9znz5" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.396453 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbrpw\" (UniqueName: \"kubernetes.io/projected/b10bad36-94f5-475c-9e2d-c99fcac85f6a-kube-api-access-vbrpw\") pod \"auto-csr-approver-29533934-9znz5\" (UID: \"b10bad36-94f5-475c-9e2d-c99fcac85f6a\") " pod="openshift-infra/auto-csr-approver-29533934-9znz5" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.469171 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533934-9znz5" Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.728807 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c50d4693-04e4-40a4-a07d-9475ce9b0125","Type":"ContainerStarted","Data":"7fb58539d40cf8d7ecbe4901a97e03ebefe0194fdecd71af1282971cdcd1f958"} Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.786690 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"197a7c87-0333-4170-b28b-df07e57347fc","Type":"ContainerStarted","Data":"6dd631df97f032f795486d71e280946fdc1d3b6f6aecf1ae8a8e1456445e8dba"} Feb 25 16:14:00 crc kubenswrapper[4937]: I0225 16:14:00.917060 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 16:14:01 crc kubenswrapper[4937]: I0225 16:14:01.128612 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533934-9znz5"] Feb 25 16:14:01 crc kubenswrapper[4937]: I0225 16:14:01.427850 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c021e5b5-9038-4a91-8785-7461a1d3c981" path="/var/lib/kubelet/pods/c021e5b5-9038-4a91-8785-7461a1d3c981/volumes" Feb 25 16:14:01 crc kubenswrapper[4937]: I0225 16:14:01.824987 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"197a7c87-0333-4170-b28b-df07e57347fc","Type":"ContainerStarted","Data":"07872ef5d3d2f5cb01e72eac996f68e3db8f918906fb3a96d87e5b891b43cf3f"} Feb 25 16:14:01 crc kubenswrapper[4937]: I0225 16:14:01.835675 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533934-9znz5" event={"ID":"b10bad36-94f5-475c-9e2d-c99fcac85f6a","Type":"ContainerStarted","Data":"d426c71a10a2d8e9e1d1128d8314e88e8ac3bd53769d628125f1bd653502fb7c"} Feb 25 16:14:01 crc kubenswrapper[4937]: I0225 16:14:01.860765 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c50d4693-04e4-40a4-a07d-9475ce9b0125","Type":"ContainerStarted","Data":"d8e63b1c827fa29146d40580c12fba8a9842ae5fe9dc7d510d245df11af69065"} Feb 25 16:14:01 crc kubenswrapper[4937]: I0225 16:14:01.869664 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"55cf40d2-3819-46a7-b9c1-aad7f3a65542","Type":"ContainerStarted","Data":"f49942aad25b4fa12374e3150f4500b316844f4f52d099b10ab8ae3ac80c83e1"} Feb 25 16:14:02 crc kubenswrapper[4937]: I0225 16:14:02.904983 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"55cf40d2-3819-46a7-b9c1-aad7f3a65542","Type":"ContainerStarted","Data":"7a9080b7ceeb9f10b33a2899a8f346fab44b3c7d6c253abf77ec442e73143a9a"} Feb 25 16:14:02 crc kubenswrapper[4937]: I0225 16:14:02.913686 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c50d4693-04e4-40a4-a07d-9475ce9b0125","Type":"ContainerStarted","Data":"970cba147070f6708c0c7ddcc785b2b95deaa149017fc45e4d94827ed668433d"} Feb 25 16:14:02 crc kubenswrapper[4937]: I0225 16:14:02.943208 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.943189435 podStartE2EDuration="4.943189435s" podCreationTimestamp="2026-02-25 16:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:14:02.935886342 +0000 UTC m=+1693.949278242" watchObservedRunningTime="2026-02-25 16:14:02.943189435 +0000 UTC m=+1693.956581325" Feb 25 16:14:03 crc kubenswrapper[4937]: I0225 16:14:03.809742 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="2577b339-c9c0-4e63-afa1-c0b2fb7177b4" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.200:8889/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 25 16:14:03 crc kubenswrapper[4937]: I0225 16:14:03.834770 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cloudkitty-api-0" podUID="2577b339-c9c0-4e63-afa1-c0b2fb7177b4" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.200:8889/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 16:14:03 crc kubenswrapper[4937]: I0225 16:14:03.924259 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"55cf40d2-3819-46a7-b9c1-aad7f3a65542","Type":"ContainerStarted","Data":"acb6ecbde3ea776d1ce2f461ce3a83d8cc2af6d35b65d1b048c2f035121b70dd"} Feb 25 16:14:03 crc kubenswrapper[4937]: I0225 16:14:03.926654 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"197a7c87-0333-4170-b28b-df07e57347fc","Type":"ContainerStarted","Data":"67da02249cf972ed9bb63aa7e1f156b380848f5082c4976fa5cd07296ba5fc06"} Feb 25 16:14:03 crc kubenswrapper[4937]: I0225 16:14:03.928308 4937 generic.go:334] "Generic (PLEG): container finished" podID="b10bad36-94f5-475c-9e2d-c99fcac85f6a" containerID="bddb6db6b71410649176b95603e990a6c0ec09b56b153e0fe57e3a53f5a7b3ef" exitCode=0 Feb 25 16:14:03 crc kubenswrapper[4937]: I0225 16:14:03.928345 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533934-9znz5" event={"ID":"b10bad36-94f5-475c-9e2d-c99fcac85f6a","Type":"ContainerDied","Data":"bddb6db6b71410649176b95603e990a6c0ec09b56b153e0fe57e3a53f5a7b3ef"} Feb 25 16:14:03 crc kubenswrapper[4937]: I0225 16:14:03.978595 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.9785742509999995 podStartE2EDuration="4.978574251s" podCreationTimestamp="2026-02-25 16:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:14:03.95894035 +0000 UTC m=+1694.972332250" watchObservedRunningTime="2026-02-25 16:14:03.978574251 +0000 UTC m=+1694.991966141" Feb 25 16:14:05 crc kubenswrapper[4937]: I0225 16:14:05.439033 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533934-9znz5" Feb 25 16:14:05 crc kubenswrapper[4937]: I0225 16:14:05.513233 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbrpw\" (UniqueName: \"kubernetes.io/projected/b10bad36-94f5-475c-9e2d-c99fcac85f6a-kube-api-access-vbrpw\") pod \"b10bad36-94f5-475c-9e2d-c99fcac85f6a\" (UID: \"b10bad36-94f5-475c-9e2d-c99fcac85f6a\") " Feb 25 16:14:05 crc kubenswrapper[4937]: I0225 16:14:05.518028 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b10bad36-94f5-475c-9e2d-c99fcac85f6a-kube-api-access-vbrpw" (OuterVolumeSpecName: "kube-api-access-vbrpw") pod "b10bad36-94f5-475c-9e2d-c99fcac85f6a" (UID: "b10bad36-94f5-475c-9e2d-c99fcac85f6a"). InnerVolumeSpecName "kube-api-access-vbrpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:14:05 crc kubenswrapper[4937]: I0225 16:14:05.615949 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbrpw\" (UniqueName: \"kubernetes.io/projected/b10bad36-94f5-475c-9e2d-c99fcac85f6a-kube-api-access-vbrpw\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:05 crc kubenswrapper[4937]: I0225 16:14:05.758090 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Feb 25 16:14:05 crc kubenswrapper[4937]: I0225 16:14:05.952182 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533934-9znz5" Feb 25 16:14:05 crc kubenswrapper[4937]: I0225 16:14:05.952200 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533934-9znz5" event={"ID":"b10bad36-94f5-475c-9e2d-c99fcac85f6a","Type":"ContainerDied","Data":"d426c71a10a2d8e9e1d1128d8314e88e8ac3bd53769d628125f1bd653502fb7c"} Feb 25 16:14:05 crc kubenswrapper[4937]: I0225 16:14:05.952266 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d426c71a10a2d8e9e1d1128d8314e88e8ac3bd53769d628125f1bd653502fb7c" Feb 25 16:14:05 crc kubenswrapper[4937]: I0225 16:14:05.955217 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"197a7c87-0333-4170-b28b-df07e57347fc","Type":"ContainerStarted","Data":"7671bf736a570eaa37fceb10cfdccca3982046b73e0e1115851165101801a9f7"} Feb 25 16:14:05 crc kubenswrapper[4937]: I0225 16:14:05.955434 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 16:14:05 crc kubenswrapper[4937]: I0225 16:14:05.982463 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7610316040000003 podStartE2EDuration="8.982436447s" podCreationTimestamp="2026-02-25 16:13:57 +0000 UTC" firstStartedPulling="2026-02-25 16:13:58.883312717 +0000 UTC m=+1689.896704607" lastFinishedPulling="2026-02-25 16:14:05.10471756 +0000 UTC m=+1696.118109450" observedRunningTime="2026-02-25 16:14:05.975667687 +0000 UTC m=+1696.989059577" watchObservedRunningTime="2026-02-25 16:14:05.982436447 +0000 UTC m=+1696.995828337" Feb 25 16:14:06 crc kubenswrapper[4937]: I0225 16:14:06.510446 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533928-xf6hf"] Feb 25 16:14:06 crc kubenswrapper[4937]: I0225 16:14:06.524188 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533928-xf6hf"] Feb 25 16:14:07 crc 
kubenswrapper[4937]: I0225 16:14:07.383564 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="934150ac-0fe5-4dad-ba78-7fdc77f53fb5" path="/var/lib/kubelet/pods/934150ac-0fe5-4dad-ba78-7fdc77f53fb5/volumes" Feb 25 16:14:08 crc kubenswrapper[4937]: I0225 16:14:08.368109 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:14:08 crc kubenswrapper[4937]: E0225 16:14:08.368707 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:14:09 crc kubenswrapper[4937]: I0225 16:14:09.137643 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 25 16:14:09 crc kubenswrapper[4937]: I0225 16:14:09.137682 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 25 16:14:09 crc kubenswrapper[4937]: I0225 16:14:09.172955 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 25 16:14:09 crc kubenswrapper[4937]: I0225 16:14:09.195919 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 25 16:14:09 crc kubenswrapper[4937]: I0225 16:14:09.995565 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 25 16:14:09 crc kubenswrapper[4937]: I0225 16:14:09.996048 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 25 16:14:10 crc kubenswrapper[4937]: I0225 16:14:10.168751 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 25 16:14:10 crc kubenswrapper[4937]: I0225 16:14:10.168827 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 25 16:14:10 crc kubenswrapper[4937]: I0225 16:14:10.214959 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 25 16:14:10 crc kubenswrapper[4937]: I0225 16:14:10.242236 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 25 16:14:11 crc kubenswrapper[4937]: I0225 16:14:11.005976 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 25 16:14:11 crc kubenswrapper[4937]: I0225 16:14:11.006285 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 25 16:14:11 crc kubenswrapper[4937]: I0225 16:14:11.927204 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 25 16:14:12 crc kubenswrapper[4937]: I0225 16:14:12.015159 4937 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 16:14:12 crc kubenswrapper[4937]: I0225 16:14:12.041597 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Feb 25 16:14:12 crc kubenswrapper[4937]: I0225 16:14:12.948000 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 25 16:14:13 crc kubenswrapper[4937]: I0225 16:14:13.024208 4937 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 16:14:13 crc kubenswrapper[4937]: I0225 16:14:13.106722 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 25 16:14:17 crc kubenswrapper[4937]: I0225 16:14:17.064702 4937 generic.go:334] "Generic (PLEG): container finished" podID="90c2ee96-d6a2-4231-abc4-e9e186375ede" containerID="7f22c4ab562bfd37ac3a88a1518f26c12f915580c0e99b5e1d8cae72324a64de" exitCode=0 Feb 25 16:14:17 crc kubenswrapper[4937]: I0225 16:14:17.064802 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ln9pv" event={"ID":"90c2ee96-d6a2-4231-abc4-e9e186375ede","Type":"ContainerDied","Data":"7f22c4ab562bfd37ac3a88a1518f26c12f915580c0e99b5e1d8cae72324a64de"} Feb 25 16:14:18 crc kubenswrapper[4937]: I0225 16:14:18.550614 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:14:18 crc kubenswrapper[4937]: I0225 16:14:18.633820 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq7cf\" (UniqueName: \"kubernetes.io/projected/90c2ee96-d6a2-4231-abc4-e9e186375ede-kube-api-access-xq7cf\") pod \"90c2ee96-d6a2-4231-abc4-e9e186375ede\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " Feb 25 16:14:18 crc kubenswrapper[4937]: I0225 16:14:18.634353 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-combined-ca-bundle\") pod \"90c2ee96-d6a2-4231-abc4-e9e186375ede\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " Feb 25 16:14:18 crc kubenswrapper[4937]: I0225 16:14:18.634417 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-scripts\") pod \"90c2ee96-d6a2-4231-abc4-e9e186375ede\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " Feb 25 16:14:18 crc kubenswrapper[4937]: I0225 16:14:18.634792 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-config-data\") pod \"90c2ee96-d6a2-4231-abc4-e9e186375ede\" (UID: \"90c2ee96-d6a2-4231-abc4-e9e186375ede\") " Feb 25 16:14:18 crc kubenswrapper[4937]: I0225 16:14:18.640910 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90c2ee96-d6a2-4231-abc4-e9e186375ede-kube-api-access-xq7cf" (OuterVolumeSpecName: "kube-api-access-xq7cf") pod "90c2ee96-d6a2-4231-abc4-e9e186375ede" (UID: "90c2ee96-d6a2-4231-abc4-e9e186375ede"). InnerVolumeSpecName "kube-api-access-xq7cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:14:18 crc kubenswrapper[4937]: I0225 16:14:18.641190 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-scripts" (OuterVolumeSpecName: "scripts") pod "90c2ee96-d6a2-4231-abc4-e9e186375ede" (UID: "90c2ee96-d6a2-4231-abc4-e9e186375ede"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:18 crc kubenswrapper[4937]: I0225 16:14:18.668343 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90c2ee96-d6a2-4231-abc4-e9e186375ede" (UID: "90c2ee96-d6a2-4231-abc4-e9e186375ede"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:18 crc kubenswrapper[4937]: I0225 16:14:18.671037 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-config-data" (OuterVolumeSpecName: "config-data") pod "90c2ee96-d6a2-4231-abc4-e9e186375ede" (UID: "90c2ee96-d6a2-4231-abc4-e9e186375ede"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:18 crc kubenswrapper[4937]: I0225 16:14:18.737859 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq7cf\" (UniqueName: \"kubernetes.io/projected/90c2ee96-d6a2-4231-abc4-e9e186375ede-kube-api-access-xq7cf\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:18 crc kubenswrapper[4937]: I0225 16:14:18.738084 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:18 crc kubenswrapper[4937]: I0225 16:14:18.738183 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:18 crc kubenswrapper[4937]: I0225 16:14:18.738262 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c2ee96-d6a2-4231-abc4-e9e186375ede-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.105015 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ln9pv" event={"ID":"90c2ee96-d6a2-4231-abc4-e9e186375ede","Type":"ContainerDied","Data":"8549ccf4eab0082e1926b1f86ac8d108c3ac2e1093c208e5b1841c0ef9afc792"} Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.105066 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8549ccf4eab0082e1926b1f86ac8d108c3ac2e1093c208e5b1841c0ef9afc792" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.105166 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ln9pv" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.231032 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 25 16:14:19 crc kubenswrapper[4937]: E0225 16:14:19.231416 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90c2ee96-d6a2-4231-abc4-e9e186375ede" containerName="nova-cell0-conductor-db-sync" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.231431 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c2ee96-d6a2-4231-abc4-e9e186375ede" containerName="nova-cell0-conductor-db-sync" Feb 25 16:14:19 crc kubenswrapper[4937]: E0225 16:14:19.231461 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b10bad36-94f5-475c-9e2d-c99fcac85f6a" containerName="oc" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.231467 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="b10bad36-94f5-475c-9e2d-c99fcac85f6a" containerName="oc" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.231688 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="90c2ee96-d6a2-4231-abc4-e9e186375ede" containerName="nova-cell0-conductor-db-sync" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.231717 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="b10bad36-94f5-475c-9e2d-c99fcac85f6a" containerName="oc" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.232414 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.237301 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.237631 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-m7lvs" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.247305 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rzgl\" (UniqueName: \"kubernetes.io/projected/55869127-b091-47f7-a8ee-02fecf833efb-kube-api-access-6rzgl\") pod \"nova-cell0-conductor-0\" (UID: \"55869127-b091-47f7-a8ee-02fecf833efb\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.247419 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55869127-b091-47f7-a8ee-02fecf833efb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"55869127-b091-47f7-a8ee-02fecf833efb\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.247454 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55869127-b091-47f7-a8ee-02fecf833efb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"55869127-b091-47f7-a8ee-02fecf833efb\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.250857 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.349320 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rzgl\" (UniqueName: 
\"kubernetes.io/projected/55869127-b091-47f7-a8ee-02fecf833efb-kube-api-access-6rzgl\") pod \"nova-cell0-conductor-0\" (UID: \"55869127-b091-47f7-a8ee-02fecf833efb\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.349407 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55869127-b091-47f7-a8ee-02fecf833efb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"55869127-b091-47f7-a8ee-02fecf833efb\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.349437 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55869127-b091-47f7-a8ee-02fecf833efb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"55869127-b091-47f7-a8ee-02fecf833efb\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.354060 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55869127-b091-47f7-a8ee-02fecf833efb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"55869127-b091-47f7-a8ee-02fecf833efb\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.361520 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55869127-b091-47f7-a8ee-02fecf833efb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"55869127-b091-47f7-a8ee-02fecf833efb\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.368021 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rzgl\" (UniqueName: \"kubernetes.io/projected/55869127-b091-47f7-a8ee-02fecf833efb-kube-api-access-6rzgl\") pod \"nova-cell0-conductor-0\" (UID: \"55869127-b091-47f7-a8ee-02fecf833efb\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:19 crc kubenswrapper[4937]: I0225 16:14:19.553588 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:20 crc kubenswrapper[4937]: W0225 16:14:20.005867 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55869127_b091_47f7_a8ee_02fecf833efb.slice/crio-e87803f296d0aebdf36a2184d8e368ef4bfdbd47de8fa564c4350401f3d8aee4 WatchSource:0}: Error finding container e87803f296d0aebdf36a2184d8e368ef4bfdbd47de8fa564c4350401f3d8aee4: Status 404 returned error can't find the container with id e87803f296d0aebdf36a2184d8e368ef4bfdbd47de8fa564c4350401f3d8aee4 Feb 25 16:14:20 crc kubenswrapper[4937]: I0225 16:14:20.005986 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 25 16:14:20 crc kubenswrapper[4937]: I0225 16:14:20.116419 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"55869127-b091-47f7-a8ee-02fecf833efb","Type":"ContainerStarted","Data":"e87803f296d0aebdf36a2184d8e368ef4bfdbd47de8fa564c4350401f3d8aee4"} Feb 25 16:14:21 crc kubenswrapper[4937]: I0225 16:14:21.131807 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"55869127-b091-47f7-a8ee-02fecf833efb","Type":"ContainerStarted","Data":"8585d22d0eba7125e511cb2ff2a94a05ffe2245c517813bde0d79499f0211f9a"} Feb 25 16:14:21 crc kubenswrapper[4937]: I0225 16:14:21.132328 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:21 crc kubenswrapper[4937]: I0225 16:14:21.154137 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.154107831 podStartE2EDuration="2.154107831s" podCreationTimestamp="2026-02-25 16:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:14:21.152800268 +0000 UTC m=+1712.166192158" watchObservedRunningTime="2026-02-25 16:14:21.154107831 +0000 UTC m=+1712.167499741" Feb 25 16:14:21 crc kubenswrapper[4937]: I0225 16:14:21.431813 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:14:21 crc kubenswrapper[4937]: I0225 16:14:21.432474 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="ceilometer-central-agent" containerID="cri-o://6dd631df97f032f795486d71e280946fdc1d3b6f6aecf1ae8a8e1456445e8dba" gracePeriod=30 Feb 25 16:14:21 crc kubenswrapper[4937]: I0225 16:14:21.433478 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="ceilometer-notification-agent" containerID="cri-o://07872ef5d3d2f5cb01e72eac996f68e3db8f918906fb3a96d87e5b891b43cf3f" gracePeriod=30 Feb 25 16:14:21 crc kubenswrapper[4937]: I0225 16:14:21.433574 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="proxy-httpd" containerID="cri-o://7671bf736a570eaa37fceb10cfdccca3982046b73e0e1115851165101801a9f7" gracePeriod=30 Feb 25 16:14:21 crc kubenswrapper[4937]: I0225 16:14:21.433642 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="197a7c87-0333-4170-b28b-df07e57347fc" 
containerName="sg-core" containerID="cri-o://67da02249cf972ed9bb63aa7e1f156b380848f5082c4976fa5cd07296ba5fc06" gracePeriod=30 Feb 25 16:14:21 crc kubenswrapper[4937]: I0225 16:14:21.445706 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.212:3000/\": EOF" Feb 25 16:14:22 crc kubenswrapper[4937]: I0225 16:14:22.144403 4937 generic.go:334] "Generic (PLEG): container finished" podID="197a7c87-0333-4170-b28b-df07e57347fc" containerID="7671bf736a570eaa37fceb10cfdccca3982046b73e0e1115851165101801a9f7" exitCode=0 Feb 25 16:14:22 crc kubenswrapper[4937]: I0225 16:14:22.144439 4937 generic.go:334] "Generic (PLEG): container finished" podID="197a7c87-0333-4170-b28b-df07e57347fc" containerID="67da02249cf972ed9bb63aa7e1f156b380848f5082c4976fa5cd07296ba5fc06" exitCode=2 Feb 25 16:14:22 crc kubenswrapper[4937]: I0225 16:14:22.144450 4937 generic.go:334] "Generic (PLEG): container finished" podID="197a7c87-0333-4170-b28b-df07e57347fc" containerID="6dd631df97f032f795486d71e280946fdc1d3b6f6aecf1ae8a8e1456445e8dba" exitCode=0 Feb 25 16:14:22 crc kubenswrapper[4937]: I0225 16:14:22.144585 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"197a7c87-0333-4170-b28b-df07e57347fc","Type":"ContainerDied","Data":"7671bf736a570eaa37fceb10cfdccca3982046b73e0e1115851165101801a9f7"} Feb 25 16:14:22 crc kubenswrapper[4937]: I0225 16:14:22.144623 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"197a7c87-0333-4170-b28b-df07e57347fc","Type":"ContainerDied","Data":"67da02249cf972ed9bb63aa7e1f156b380848f5082c4976fa5cd07296ba5fc06"} Feb 25 16:14:22 crc kubenswrapper[4937]: I0225 16:14:22.144636 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"197a7c87-0333-4170-b28b-df07e57347fc","Type":"ContainerDied","Data":"6dd631df97f032f795486d71e280946fdc1d3b6f6aecf1ae8a8e1456445e8dba"} Feb 25 16:14:23 crc kubenswrapper[4937]: I0225 16:14:23.368075 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:14:23 crc kubenswrapper[4937]: E0225 16:14:23.368684 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.038102 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.038663 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="55869127-b091-47f7-a8ee-02fecf833efb" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8585d22d0eba7125e511cb2ff2a94a05ffe2245c517813bde0d79499f0211f9a" gracePeriod=30 Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.714101 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.758061 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-combined-ca-bundle\") pod \"197a7c87-0333-4170-b28b-df07e57347fc\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.758153 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-sg-core-conf-yaml\") pod \"197a7c87-0333-4170-b28b-df07e57347fc\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.758228 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-scripts\") pod \"197a7c87-0333-4170-b28b-df07e57347fc\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.758288 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzflf\" (UniqueName: \"kubernetes.io/projected/197a7c87-0333-4170-b28b-df07e57347fc-kube-api-access-gzflf\") pod \"197a7c87-0333-4170-b28b-df07e57347fc\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.758351 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/197a7c87-0333-4170-b28b-df07e57347fc-log-httpd\") pod \"197a7c87-0333-4170-b28b-df07e57347fc\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.758373 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-config-data\") pod \"197a7c87-0333-4170-b28b-df07e57347fc\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.758397 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/197a7c87-0333-4170-b28b-df07e57347fc-run-httpd\") pod \"197a7c87-0333-4170-b28b-df07e57347fc\" (UID: \"197a7c87-0333-4170-b28b-df07e57347fc\") " Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.759188 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/197a7c87-0333-4170-b28b-df07e57347fc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "197a7c87-0333-4170-b28b-df07e57347fc" (UID: "197a7c87-0333-4170-b28b-df07e57347fc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.759815 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/197a7c87-0333-4170-b28b-df07e57347fc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "197a7c87-0333-4170-b28b-df07e57347fc" (UID: "197a7c87-0333-4170-b28b-df07e57347fc"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.773727 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/197a7c87-0333-4170-b28b-df07e57347fc-kube-api-access-gzflf" (OuterVolumeSpecName: "kube-api-access-gzflf") pod "197a7c87-0333-4170-b28b-df07e57347fc" (UID: "197a7c87-0333-4170-b28b-df07e57347fc"). InnerVolumeSpecName "kube-api-access-gzflf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.793715 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-scripts" (OuterVolumeSpecName: "scripts") pod "197a7c87-0333-4170-b28b-df07e57347fc" (UID: "197a7c87-0333-4170-b28b-df07e57347fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.867423 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.867474 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzflf\" (UniqueName: \"kubernetes.io/projected/197a7c87-0333-4170-b28b-df07e57347fc-kube-api-access-gzflf\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.867505 4937 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/197a7c87-0333-4170-b28b-df07e57347fc-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.867514 4937 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/197a7c87-0333-4170-b28b-df07e57347fc-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.882479 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "197a7c87-0333-4170-b28b-df07e57347fc" (UID: "197a7c87-0333-4170-b28b-df07e57347fc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.921096 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "197a7c87-0333-4170-b28b-df07e57347fc" (UID: "197a7c87-0333-4170-b28b-df07e57347fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.952857 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-config-data" (OuterVolumeSpecName: "config-data") pod "197a7c87-0333-4170-b28b-df07e57347fc" (UID: "197a7c87-0333-4170-b28b-df07e57347fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.969845 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.969890 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:24 crc kubenswrapper[4937]: I0225 16:14:24.969906 4937 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/197a7c87-0333-4170-b28b-df07e57347fc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.179438 4937 generic.go:334] "Generic (PLEG): container finished" podID="197a7c87-0333-4170-b28b-df07e57347fc" containerID="07872ef5d3d2f5cb01e72eac996f68e3db8f918906fb3a96d87e5b891b43cf3f" exitCode=0 Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.179561 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"197a7c87-0333-4170-b28b-df07e57347fc","Type":"ContainerDied","Data":"07872ef5d3d2f5cb01e72eac996f68e3db8f918906fb3a96d87e5b891b43cf3f"} Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.179593 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"197a7c87-0333-4170-b28b-df07e57347fc","Type":"ContainerDied","Data":"a652955ada0e39b93df2f015ac2eadffef3f3543e7b500198fee1dd99115bd81"} Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.179613 4937 scope.go:117] "RemoveContainer" containerID="7671bf736a570eaa37fceb10cfdccca3982046b73e0e1115851165101801a9f7" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.179655 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.201992 4937 scope.go:117] "RemoveContainer" containerID="67da02249cf972ed9bb63aa7e1f156b380848f5082c4976fa5cd07296ba5fc06" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.231954 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.239215 4937 scope.go:117] "RemoveContainer" containerID="07872ef5d3d2f5cb01e72eac996f68e3db8f918906fb3a96d87e5b891b43cf3f" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.244431 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.260460 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:14:25 crc kubenswrapper[4937]: E0225 16:14:25.261626 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="sg-core" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.261656 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="sg-core" Feb 25 16:14:25 crc kubenswrapper[4937]: E0225 16:14:25.261681 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="ceilometer-central-agent" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.261692 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="ceilometer-central-agent" Feb 25 16:14:25 crc kubenswrapper[4937]: E0225 16:14:25.261755 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="proxy-httpd" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.261766 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="proxy-httpd" Feb 25 16:14:25 crc kubenswrapper[4937]: E0225 16:14:25.261801 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="ceilometer-notification-agent" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.261808 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="ceilometer-notification-agent" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.262276 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="ceilometer-notification-agent" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.262324 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="ceilometer-central-agent" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.262350 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="sg-core" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.262380 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="197a7c87-0333-4170-b28b-df07e57347fc" containerName="proxy-httpd" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.275826 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.280136 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph84v\" (UniqueName: \"kubernetes.io/projected/5e125caa-a643-4a1e-81bc-e9983b90b640-kube-api-access-ph84v\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.280211 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-config-data\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.280272 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.280397 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e125caa-a643-4a1e-81bc-e9983b90b640-log-httpd\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.280466 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-scripts\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.280525 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.280562 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e125caa-a643-4a1e-81bc-e9983b90b640-run-httpd\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.280967 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.286497 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.292832 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.344232 4937 scope.go:117] "RemoveContainer" containerID="6dd631df97f032f795486d71e280946fdc1d3b6f6aecf1ae8a8e1456445e8dba" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.384207 4937 scope.go:117] "RemoveContainer" containerID="7671bf736a570eaa37fceb10cfdccca3982046b73e0e1115851165101801a9f7" Feb 25 16:14:25 crc kubenswrapper[4937]: E0225 
16:14:25.392693 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7671bf736a570eaa37fceb10cfdccca3982046b73e0e1115851165101801a9f7\": container with ID starting with 7671bf736a570eaa37fceb10cfdccca3982046b73e0e1115851165101801a9f7 not found: ID does not exist" containerID="7671bf736a570eaa37fceb10cfdccca3982046b73e0e1115851165101801a9f7" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.392741 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7671bf736a570eaa37fceb10cfdccca3982046b73e0e1115851165101801a9f7"} err="failed to get container status \"7671bf736a570eaa37fceb10cfdccca3982046b73e0e1115851165101801a9f7\": rpc error: code = NotFound desc = could not find container \"7671bf736a570eaa37fceb10cfdccca3982046b73e0e1115851165101801a9f7\": container with ID starting with 7671bf736a570eaa37fceb10cfdccca3982046b73e0e1115851165101801a9f7 not found: ID does not exist" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.392772 4937 scope.go:117] "RemoveContainer" containerID="67da02249cf972ed9bb63aa7e1f156b380848f5082c4976fa5cd07296ba5fc06" Feb 25 16:14:25 crc kubenswrapper[4937]: E0225 16:14:25.393193 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67da02249cf972ed9bb63aa7e1f156b380848f5082c4976fa5cd07296ba5fc06\": container with ID starting with 67da02249cf972ed9bb63aa7e1f156b380848f5082c4976fa5cd07296ba5fc06 not found: ID does not exist" containerID="67da02249cf972ed9bb63aa7e1f156b380848f5082c4976fa5cd07296ba5fc06" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.393221 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67da02249cf972ed9bb63aa7e1f156b380848f5082c4976fa5cd07296ba5fc06"} err="failed to get container status \"67da02249cf972ed9bb63aa7e1f156b380848f5082c4976fa5cd07296ba5fc06\": rpc error: code = NotFound desc = could not find container \"67da02249cf972ed9bb63aa7e1f156b380848f5082c4976fa5cd07296ba5fc06\": container with ID starting with 67da02249cf972ed9bb63aa7e1f156b380848f5082c4976fa5cd07296ba5fc06 not found: ID does not exist" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.393237 4937 scope.go:117] "RemoveContainer" containerID="07872ef5d3d2f5cb01e72eac996f68e3db8f918906fb3a96d87e5b891b43cf3f" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.394014 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e125caa-a643-4a1e-81bc-e9983b90b640-log-httpd\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.394143 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-scripts\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.394187 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.394228 4937 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e125caa-a643-4a1e-81bc-e9983b90b640-run-httpd\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.394426 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph84v\" (UniqueName: \"kubernetes.io/projected/5e125caa-a643-4a1e-81bc-e9983b90b640-kube-api-access-ph84v\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.394477 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-config-data\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.394585 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.396995 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e125caa-a643-4a1e-81bc-e9983b90b640-run-httpd\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.397212 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e125caa-a643-4a1e-81bc-e9983b90b640-log-httpd\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: E0225 16:14:25.397699 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07872ef5d3d2f5cb01e72eac996f68e3db8f918906fb3a96d87e5b891b43cf3f\": container with ID starting with 07872ef5d3d2f5cb01e72eac996f68e3db8f918906fb3a96d87e5b891b43cf3f not found: ID does not exist" containerID="07872ef5d3d2f5cb01e72eac996f68e3db8f918906fb3a96d87e5b891b43cf3f" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.397757 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07872ef5d3d2f5cb01e72eac996f68e3db8f918906fb3a96d87e5b891b43cf3f"} err="failed to get container status \"07872ef5d3d2f5cb01e72eac996f68e3db8f918906fb3a96d87e5b891b43cf3f\": rpc error: code = NotFound desc = could not find container \"07872ef5d3d2f5cb01e72eac996f68e3db8f918906fb3a96d87e5b891b43cf3f\": container with ID starting with 07872ef5d3d2f5cb01e72eac996f68e3db8f918906fb3a96d87e5b891b43cf3f not found: ID does not exist" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.397791 4937 scope.go:117] "RemoveContainer" containerID="6dd631df97f032f795486d71e280946fdc1d3b6f6aecf1ae8a8e1456445e8dba" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.399842 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.404060 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-scripts\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.414894 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-config-data\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.416886 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.417266 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="197a7c87-0333-4170-b28b-df07e57347fc" path="/var/lib/kubelet/pods/197a7c87-0333-4170-b28b-df07e57347fc/volumes" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.418253 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:14:25 crc kubenswrapper[4937]: E0225 16:14:25.418781 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-ph84v], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="5e125caa-a643-4a1e-81bc-e9983b90b640" Feb 25 16:14:25 crc kubenswrapper[4937]: E0225 16:14:25.419930 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dd631df97f032f795486d71e280946fdc1d3b6f6aecf1ae8a8e1456445e8dba\": container with ID starting with 6dd631df97f032f795486d71e280946fdc1d3b6f6aecf1ae8a8e1456445e8dba not found: ID does not exist" containerID="6dd631df97f032f795486d71e280946fdc1d3b6f6aecf1ae8a8e1456445e8dba" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.419994 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dd631df97f032f795486d71e280946fdc1d3b6f6aecf1ae8a8e1456445e8dba"} err="failed to get container status \"6dd631df97f032f795486d71e280946fdc1d3b6f6aecf1ae8a8e1456445e8dba\": rpc error: code = NotFound desc = could not find container \"6dd631df97f032f795486d71e280946fdc1d3b6f6aecf1ae8a8e1456445e8dba\": container with ID starting with 6dd631df97f032f795486d71e280946fdc1d3b6f6aecf1ae8a8e1456445e8dba not found: ID does not exist" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.421384 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph84v\" (UniqueName: \"kubernetes.io/projected/5e125caa-a643-4a1e-81bc-e9983b90b640-kube-api-access-ph84v\") pod \"ceilometer-0\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " pod="openstack/ceilometer-0" Feb 25 16:14:25 crc kubenswrapper[4937]: I0225 16:14:25.997579 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.112774 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rzgl\" (UniqueName: \"kubernetes.io/projected/55869127-b091-47f7-a8ee-02fecf833efb-kube-api-access-6rzgl\") pod \"55869127-b091-47f7-a8ee-02fecf833efb\" (UID: \"55869127-b091-47f7-a8ee-02fecf833efb\") " Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.112875 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55869127-b091-47f7-a8ee-02fecf833efb-config-data\") pod \"55869127-b091-47f7-a8ee-02fecf833efb\" (UID: \"55869127-b091-47f7-a8ee-02fecf833efb\") " Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.113164 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55869127-b091-47f7-a8ee-02fecf833efb-combined-ca-bundle\") pod \"55869127-b091-47f7-a8ee-02fecf833efb\" (UID: \"55869127-b091-47f7-a8ee-02fecf833efb\") " Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.120465 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55869127-b091-47f7-a8ee-02fecf833efb-kube-api-access-6rzgl" (OuterVolumeSpecName: "kube-api-access-6rzgl") pod "55869127-b091-47f7-a8ee-02fecf833efb" (UID: "55869127-b091-47f7-a8ee-02fecf833efb"). InnerVolumeSpecName "kube-api-access-6rzgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.151517 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55869127-b091-47f7-a8ee-02fecf833efb-config-data" (OuterVolumeSpecName: "config-data") pod "55869127-b091-47f7-a8ee-02fecf833efb" (UID: "55869127-b091-47f7-a8ee-02fecf833efb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.165470 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55869127-b091-47f7-a8ee-02fecf833efb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55869127-b091-47f7-a8ee-02fecf833efb" (UID: "55869127-b091-47f7-a8ee-02fecf833efb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.199273 4937 generic.go:334] "Generic (PLEG): container finished" podID="55869127-b091-47f7-a8ee-02fecf833efb" containerID="8585d22d0eba7125e511cb2ff2a94a05ffe2245c517813bde0d79499f0211f9a" exitCode=0 Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.199457 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.200098 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.200689 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"55869127-b091-47f7-a8ee-02fecf833efb","Type":"ContainerDied","Data":"8585d22d0eba7125e511cb2ff2a94a05ffe2245c517813bde0d79499f0211f9a"} Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.200736 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"55869127-b091-47f7-a8ee-02fecf833efb","Type":"ContainerDied","Data":"e87803f296d0aebdf36a2184d8e368ef4bfdbd47de8fa564c4350401f3d8aee4"} Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.200792 4937 scope.go:117] "RemoveContainer" containerID="8585d22d0eba7125e511cb2ff2a94a05ffe2245c517813bde0d79499f0211f9a" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.214342 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.215951 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55869127-b091-47f7-a8ee-02fecf833efb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.215999 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rzgl\" (UniqueName: \"kubernetes.io/projected/55869127-b091-47f7-a8ee-02fecf833efb-kube-api-access-6rzgl\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.216019 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55869127-b091-47f7-a8ee-02fecf833efb-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.239384 4937 scope.go:117] "RemoveContainer" containerID="8585d22d0eba7125e511cb2ff2a94a05ffe2245c517813bde0d79499f0211f9a" Feb 25 16:14:26 crc kubenswrapper[4937]: E0225 16:14:26.241054 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8585d22d0eba7125e511cb2ff2a94a05ffe2245c517813bde0d79499f0211f9a\": container with ID starting with 8585d22d0eba7125e511cb2ff2a94a05ffe2245c517813bde0d79499f0211f9a not found: ID does not exist" containerID="8585d22d0eba7125e511cb2ff2a94a05ffe2245c517813bde0d79499f0211f9a" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.241099 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8585d22d0eba7125e511cb2ff2a94a05ffe2245c517813bde0d79499f0211f9a"} err="failed to get container status \"8585d22d0eba7125e511cb2ff2a94a05ffe2245c517813bde0d79499f0211f9a\": rpc error: code = NotFound desc = could not find container \"8585d22d0eba7125e511cb2ff2a94a05ffe2245c517813bde0d79499f0211f9a\": container with ID starting with 8585d22d0eba7125e511cb2ff2a94a05ffe2245c517813bde0d79499f0211f9a not found: ID does not exist" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.265305 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.287854 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.303179 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 25 
16:14:26 crc kubenswrapper[4937]: E0225 16:14:26.303970 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55869127-b091-47f7-a8ee-02fecf833efb" containerName="nova-cell0-conductor-conductor" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.304002 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="55869127-b091-47f7-a8ee-02fecf833efb" containerName="nova-cell0-conductor-conductor" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.304415 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="55869127-b091-47f7-a8ee-02fecf833efb" containerName="nova-cell0-conductor-conductor" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.305633 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.308234 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-m7lvs" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.308676 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.317441 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e125caa-a643-4a1e-81bc-e9983b90b640-log-httpd\") pod \"5e125caa-a643-4a1e-81bc-e9983b90b640\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.317549 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-config-data\") pod \"5e125caa-a643-4a1e-81bc-e9983b90b640\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.317599 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph84v\" (UniqueName: \"kubernetes.io/projected/5e125caa-a643-4a1e-81bc-e9983b90b640-kube-api-access-ph84v\") pod \"5e125caa-a643-4a1e-81bc-e9983b90b640\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.317663 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-scripts\") pod \"5e125caa-a643-4a1e-81bc-e9983b90b640\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.318330 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e125caa-a643-4a1e-81bc-e9983b90b640-run-httpd\") pod \"5e125caa-a643-4a1e-81bc-e9983b90b640\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.318374 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-sg-core-conf-yaml\") pod \"5e125caa-a643-4a1e-81bc-e9983b90b640\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.318527 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-combined-ca-bundle\") pod 
\"5e125caa-a643-4a1e-81bc-e9983b90b640\" (UID: \"5e125caa-a643-4a1e-81bc-e9983b90b640\") " Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.318479 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e125caa-a643-4a1e-81bc-e9983b90b640-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5e125caa-a643-4a1e-81bc-e9983b90b640" (UID: "5e125caa-a643-4a1e-81bc-e9983b90b640"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.318753 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69df688-29bc-47e8-98ef-56b506f9e7c1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c69df688-29bc-47e8-98ef-56b506f9e7c1\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.318808 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69df688-29bc-47e8-98ef-56b506f9e7c1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c69df688-29bc-47e8-98ef-56b506f9e7c1\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.318826 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz6s6\" (UniqueName: \"kubernetes.io/projected/c69df688-29bc-47e8-98ef-56b506f9e7c1-kube-api-access-nz6s6\") pod \"nova-cell0-conductor-0\" (UID: \"c69df688-29bc-47e8-98ef-56b506f9e7c1\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.318831 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e125caa-a643-4a1e-81bc-e9983b90b640-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5e125caa-a643-4a1e-81bc-e9983b90b640" (UID: "5e125caa-a643-4a1e-81bc-e9983b90b640"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.319149 4937 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e125caa-a643-4a1e-81bc-e9983b90b640-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.319166 4937 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e125caa-a643-4a1e-81bc-e9983b90b640-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.325938 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.326228 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-scripts" (OuterVolumeSpecName: "scripts") pod "5e125caa-a643-4a1e-81bc-e9983b90b640" (UID: "5e125caa-a643-4a1e-81bc-e9983b90b640"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.326756 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-config-data" (OuterVolumeSpecName: "config-data") pod "5e125caa-a643-4a1e-81bc-e9983b90b640" (UID: "5e125caa-a643-4a1e-81bc-e9983b90b640"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.327030 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e125caa-a643-4a1e-81bc-e9983b90b640" (UID: "5e125caa-a643-4a1e-81bc-e9983b90b640"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.329096 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e125caa-a643-4a1e-81bc-e9983b90b640-kube-api-access-ph84v" (OuterVolumeSpecName: "kube-api-access-ph84v") pod "5e125caa-a643-4a1e-81bc-e9983b90b640" (UID: "5e125caa-a643-4a1e-81bc-e9983b90b640"). InnerVolumeSpecName "kube-api-access-ph84v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.332084 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5e125caa-a643-4a1e-81bc-e9983b90b640" (UID: "5e125caa-a643-4a1e-81bc-e9983b90b640"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.424113 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69df688-29bc-47e8-98ef-56b506f9e7c1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c69df688-29bc-47e8-98ef-56b506f9e7c1\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.424221 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69df688-29bc-47e8-98ef-56b506f9e7c1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c69df688-29bc-47e8-98ef-56b506f9e7c1\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.424523 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz6s6\" (UniqueName: \"kubernetes.io/projected/c69df688-29bc-47e8-98ef-56b506f9e7c1-kube-api-access-nz6s6\") pod \"nova-cell0-conductor-0\" (UID: \"c69df688-29bc-47e8-98ef-56b506f9e7c1\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.425376 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.425405 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.425422 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph84v\" (UniqueName: \"kubernetes.io/projected/5e125caa-a643-4a1e-81bc-e9983b90b640-kube-api-access-ph84v\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.425440 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.425449 4937 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e125caa-a643-4a1e-81bc-e9983b90b640-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.428247 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69df688-29bc-47e8-98ef-56b506f9e7c1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c69df688-29bc-47e8-98ef-56b506f9e7c1\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.430112 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69df688-29bc-47e8-98ef-56b506f9e7c1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c69df688-29bc-47e8-98ef-56b506f9e7c1\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.452790 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz6s6\" (UniqueName: \"kubernetes.io/projected/c69df688-29bc-47e8-98ef-56b506f9e7c1-kube-api-access-nz6s6\") pod 
\"nova-cell0-conductor-0\" (UID: \"c69df688-29bc-47e8-98ef-56b506f9e7c1\") " pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:26 crc kubenswrapper[4937]: I0225 16:14:26.720468 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.236333 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.370717 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.420060 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55869127-b091-47f7-a8ee-02fecf833efb" path="/var/lib/kubelet/pods/55869127-b091-47f7-a8ee-02fecf833efb/volumes" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.420902 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.420936 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.424009 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.427082 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.427329 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.441209 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.559680 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.559834 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.559883 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7030cdee-8f14-4cd8-959a-f941ac0414e9-run-httpd\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.559943 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pgvj\" (UniqueName: \"kubernetes.io/projected/7030cdee-8f14-4cd8-959a-f941ac0414e9-kube-api-access-7pgvj\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.559976 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7030cdee-8f14-4cd8-959a-f941ac0414e9-log-httpd\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.560026 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-scripts\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.560049 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-config-data\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.662113 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.662226 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.662258 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7030cdee-8f14-4cd8-959a-f941ac0414e9-run-httpd\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.662296 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pgvj\" (UniqueName: \"kubernetes.io/projected/7030cdee-8f14-4cd8-959a-f941ac0414e9-kube-api-access-7pgvj\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.662328 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7030cdee-8f14-4cd8-959a-f941ac0414e9-log-httpd\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.662359 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-scripts\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.662376 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-config-data\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.663035 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7030cdee-8f14-4cd8-959a-f941ac0414e9-log-httpd\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.663287 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7030cdee-8f14-4cd8-959a-f941ac0414e9-run-httpd\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.675404 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.676454 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-scripts\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.676776 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-config-data\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.683226 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.687143 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pgvj\" (UniqueName: \"kubernetes.io/projected/7030cdee-8f14-4cd8-959a-f941ac0414e9-kube-api-access-7pgvj\") pod \"ceilometer-0\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " pod="openstack/ceilometer-0" Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.731882 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 25 16:14:27 crc kubenswrapper[4937]: I0225 16:14:27.770846 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:14:28 crc kubenswrapper[4937]: I0225 16:14:28.248426 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c69df688-29bc-47e8-98ef-56b506f9e7c1","Type":"ContainerStarted","Data":"aad19e26b0e0dedcb0baabb4f8f76d4f708eec111e6c318dc401168d12e20c37"} Feb 25 16:14:28 crc kubenswrapper[4937]: I0225 16:14:28.248918 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c69df688-29bc-47e8-98ef-56b506f9e7c1","Type":"ContainerStarted","Data":"8f0a14b414c5e0ec71d070cbd1f3571b6d36f01a336413ed871b3b8b23a848a6"} Feb 25 16:14:28 crc kubenswrapper[4937]: I0225 16:14:28.248942 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:28 crc kubenswrapper[4937]: I0225 16:14:28.267327 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.267311693 podStartE2EDuration="2.267311693s" podCreationTimestamp="2026-02-25 16:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:14:28.260208845 +0000 UTC m=+1719.273600735" watchObservedRunningTime="2026-02-25 16:14:28.267311693 +0000 UTC m=+1719.280703583" Feb 25 16:14:28 crc kubenswrapper[4937]: I0225 16:14:28.308579 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:14:29 crc kubenswrapper[4937]: I0225 16:14:29.258175 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7030cdee-8f14-4cd8-959a-f941ac0414e9","Type":"ContainerStarted","Data":"48448a2d7fdc06cb5701ddae94292da76ca94b3c5afdaae229c71463a10538ad"} Feb 25 16:14:29 crc kubenswrapper[4937]: I0225 16:14:29.380143 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e125caa-a643-4a1e-81bc-e9983b90b640" path="/var/lib/kubelet/pods/5e125caa-a643-4a1e-81bc-e9983b90b640/volumes" Feb 25 16:14:30 crc kubenswrapper[4937]: I0225 16:14:30.269569 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7030cdee-8f14-4cd8-959a-f941ac0414e9","Type":"ContainerStarted","Data":"7380a46d4a13eebb08f667fc944c523adef599821a7a2a344f2c535ed31ed736"} Feb 25 16:14:31 crc kubenswrapper[4937]: I0225 16:14:31.281904 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7030cdee-8f14-4cd8-959a-f941ac0414e9","Type":"ContainerStarted","Data":"da49f7b78ff9b1b0557982a4df2ebde43a9731336dca7991680119793c917b1d"} Feb 25 16:14:32 crc kubenswrapper[4937]: I0225 16:14:32.295147 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7030cdee-8f14-4cd8-959a-f941ac0414e9","Type":"ContainerStarted","Data":"3b8030e81823bf310684ff855a375a6ba05dbac1ba9f0a61fe66a8ce26b61481"} Feb 25 16:14:36 crc kubenswrapper[4937]: I0225 16:14:36.159458 4937 scope.go:117] "RemoveContainer" containerID="8485e4e6c778f1b465c642cc0a1962e713cf730d59aea13a70b205613b03f028" Feb 25 16:14:36 crc kubenswrapper[4937]: I0225 16:14:36.748612 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.230027 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9ws68"] Feb 25 16:14:37 crc 
kubenswrapper[4937]: I0225 16:14:37.231282 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.233986 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.234147 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.240563 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9ws68"] Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.367977 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:14:37 crc kubenswrapper[4937]: E0225 16:14:37.368249 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.392282 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxsxk\" (UniqueName: \"kubernetes.io/projected/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-kube-api-access-pxsxk\") pod \"nova-cell0-cell-mapping-9ws68\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.392395 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9ws68\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.392627 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-config-data\") pod \"nova-cell0-cell-mapping-9ws68\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.392832 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-scripts\") pod \"nova-cell0-cell-mapping-9ws68\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.429900 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.432344 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.436408 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.446008 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.497810 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9ws68\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.497916 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-config-data\") pod \"nova-cell0-cell-mapping-9ws68\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.498005 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-scripts\") pod \"nova-cell0-cell-mapping-9ws68\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.498133 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxsxk\" (UniqueName: \"kubernetes.io/projected/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-kube-api-access-pxsxk\") pod \"nova-cell0-cell-mapping-9ws68\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.509225 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-config-data\") pod \"nova-cell0-cell-mapping-9ws68\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.511030 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-scripts\") pod \"nova-cell0-cell-mapping-9ws68\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.513531 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9ws68\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.532032 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxsxk\" (UniqueName: \"kubernetes.io/projected/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-kube-api-access-pxsxk\") pod \"nova-cell0-cell-mapping-9ws68\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.557625 4937 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.586573 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.588226 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.592532 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.605233 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db31ec58-43e7-4625-9551-84405e20c3f5-logs\") pod \"nova-api-0\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.605348 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db31ec58-43e7-4625-9551-84405e20c3f5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.605395 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db31ec58-43e7-4625-9551-84405e20c3f5-config-data\") pod \"nova-api-0\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.605503 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fmt9\" (UniqueName: \"kubernetes.io/projected/db31ec58-43e7-4625-9551-84405e20c3f5-kube-api-access-4fmt9\") pod \"nova-api-0\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.617173 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.645573 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.652215 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.661603 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.679365 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.710625 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fmt9\" (UniqueName: \"kubernetes.io/projected/db31ec58-43e7-4625-9551-84405e20c3f5-kube-api-access-4fmt9\") pod \"nova-api-0\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.710689 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db31ec58-43e7-4625-9551-84405e20c3f5-logs\") pod \"nova-api-0\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.710761 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db31ec58-43e7-4625-9551-84405e20c3f5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.710796 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " pod="openstack/nova-metadata-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.710820 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db31ec58-43e7-4625-9551-84405e20c3f5-config-data\") pod \"nova-api-0\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.710839 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-logs\") pod \"nova-metadata-0\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " pod="openstack/nova-metadata-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.710852 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz89r\" (UniqueName: \"kubernetes.io/projected/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-kube-api-access-rz89r\") pod \"nova-metadata-0\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " pod="openstack/nova-metadata-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.710900 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-config-data\") pod \"nova-metadata-0\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " pod="openstack/nova-metadata-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.711852 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db31ec58-43e7-4625-9551-84405e20c3f5-logs\") pod \"nova-api-0\" (UID: 
\"db31ec58-43e7-4625-9551-84405e20c3f5\") " pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.722394 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db31ec58-43e7-4625-9551-84405e20c3f5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.726152 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db31ec58-43e7-4625-9551-84405e20c3f5-config-data\") pod \"nova-api-0\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.739678 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fmt9\" (UniqueName: \"kubernetes.io/projected/db31ec58-43e7-4625-9551-84405e20c3f5-kube-api-access-4fmt9\") pod \"nova-api-0\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.758009 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.785189 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cd565959-vm27p"] Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.787792 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.813573 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-logs\") pod \"nova-metadata-0\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " pod="openstack/nova-metadata-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.813606 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz89r\" (UniqueName: \"kubernetes.io/projected/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-kube-api-access-rz89r\") pod \"nova-metadata-0\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " pod="openstack/nova-metadata-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.813654 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1670d95c-7850-442b-8b88-4b1d20464736-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1670d95c-7850-442b-8b88-4b1d20464736\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.813687 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-config-data\") pod \"nova-metadata-0\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " pod="openstack/nova-metadata-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.813766 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxlnd\" (UniqueName: \"kubernetes.io/projected/1670d95c-7850-442b-8b88-4b1d20464736-kube-api-access-gxlnd\") pod \"nova-scheduler-0\" (UID: \"1670d95c-7850-442b-8b88-4b1d20464736\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.813831 
4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1670d95c-7850-442b-8b88-4b1d20464736-config-data\") pod \"nova-scheduler-0\" (UID: \"1670d95c-7850-442b-8b88-4b1d20464736\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.813869 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " pod="openstack/nova-metadata-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.822001 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-logs\") pod \"nova-metadata-0\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " pod="openstack/nova-metadata-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.824918 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " pod="openstack/nova-metadata-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.836955 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-config-data\") pod \"nova-metadata-0\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " pod="openstack/nova-metadata-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.841222 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-vm27p"] Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.845001 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz89r\" (UniqueName: \"kubernetes.io/projected/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-kube-api-access-rz89r\") pod \"nova-metadata-0\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " pod="openstack/nova-metadata-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.858897 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.860493 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.864557 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.882515 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.918650 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1670d95c-7850-442b-8b88-4b1d20464736-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1670d95c-7850-442b-8b88-4b1d20464736\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.918791 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtk7l\" (UniqueName: \"kubernetes.io/projected/22da4be7-1bfd-4df2-a66b-8bd47f08269c-kube-api-access-wtk7l\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.918857 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.918934 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxlnd\" (UniqueName: \"kubernetes.io/projected/1670d95c-7850-442b-8b88-4b1d20464736-kube-api-access-gxlnd\") pod \"nova-scheduler-0\" (UID: \"1670d95c-7850-442b-8b88-4b1d20464736\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.918974 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.919060 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.919102 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-config\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.919145 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1670d95c-7850-442b-8b88-4b1d20464736-config-data\") pod \"nova-scheduler-0\" (UID: \"1670d95c-7850-442b-8b88-4b1d20464736\") " pod="openstack/nova-scheduler-0" 
Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.919223 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-dns-svc\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.926799 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1670d95c-7850-442b-8b88-4b1d20464736-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1670d95c-7850-442b-8b88-4b1d20464736\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.930663 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1670d95c-7850-442b-8b88-4b1d20464736-config-data\") pod \"nova-scheduler-0\" (UID: \"1670d95c-7850-442b-8b88-4b1d20464736\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:37 crc kubenswrapper[4937]: I0225 16:14:37.951853 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxlnd\" (UniqueName: \"kubernetes.io/projected/1670d95c-7850-442b-8b88-4b1d20464736-kube-api-access-gxlnd\") pod \"nova-scheduler-0\" (UID: \"1670d95c-7850-442b-8b88-4b1d20464736\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.023111 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtk7l\" (UniqueName: \"kubernetes.io/projected/22da4be7-1bfd-4df2-a66b-8bd47f08269c-kube-api-access-wtk7l\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.024463 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4mcf\" (UniqueName: \"kubernetes.io/projected/692747a2-c012-4264-8346-f4aa6755f93c-kube-api-access-t4mcf\") pod \"nova-cell1-novncproxy-0\" (UID: \"692747a2-c012-4264-8346-f4aa6755f93c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.024807 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.024852 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/692747a2-c012-4264-8346-f4aa6755f93c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"692747a2-c012-4264-8346-f4aa6755f93c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.025004 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.025228 4937 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.025812 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.027452 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.033797 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.033893 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692747a2-c012-4264-8346-f4aa6755f93c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"692747a2-c012-4264-8346-f4aa6755f93c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.033937 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-config\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.034063 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-dns-svc\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.035277 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-config\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.036216 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-dns-svc\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.046052 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtk7l\" (UniqueName: 
\"kubernetes.io/projected/22da4be7-1bfd-4df2-a66b-8bd47f08269c-kube-api-access-wtk7l\") pod \"dnsmasq-dns-78cd565959-vm27p\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.138379 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4mcf\" (UniqueName: \"kubernetes.io/projected/692747a2-c012-4264-8346-f4aa6755f93c-kube-api-access-t4mcf\") pod \"nova-cell1-novncproxy-0\" (UID: \"692747a2-c012-4264-8346-f4aa6755f93c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.138446 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/692747a2-c012-4264-8346-f4aa6755f93c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"692747a2-c012-4264-8346-f4aa6755f93c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.138637 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692747a2-c012-4264-8346-f4aa6755f93c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"692747a2-c012-4264-8346-f4aa6755f93c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.140087 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.144301 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692747a2-c012-4264-8346-f4aa6755f93c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"692747a2-c012-4264-8346-f4aa6755f93c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.145273 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/692747a2-c012-4264-8346-f4aa6755f93c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"692747a2-c012-4264-8346-f4aa6755f93c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.157028 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.168141 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.171014 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4mcf\" (UniqueName: \"kubernetes.io/projected/692747a2-c012-4264-8346-f4aa6755f93c-kube-api-access-t4mcf\") pod \"nova-cell1-novncproxy-0\" (UID: \"692747a2-c012-4264-8346-f4aa6755f93c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.187432 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.462573 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9ws68"] Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.477970 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:14:38 crc kubenswrapper[4937]: W0225 16:14:38.575609 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb31ec58_43e7_4625_9551_84405e20c3f5.slice/crio-53baa6ef8a7caa4c908549a7faca413bbce3c7739779f90969f6a27464698f98 WatchSource:0}: Error finding container 53baa6ef8a7caa4c908549a7faca413bbce3c7739779f90969f6a27464698f98: Status 404 returned error can't find the container with id 53baa6ef8a7caa4c908549a7faca413bbce3c7739779f90969f6a27464698f98 Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.611095 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4sssx"] Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.613049 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.618367 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.622757 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.662947 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-config-data\") pod \"nova-cell1-conductor-db-sync-4sssx\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.663006 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md7t9\" (UniqueName: \"kubernetes.io/projected/8a2ea03f-b76a-4775-b4ee-827cc43744c1-kube-api-access-md7t9\") pod \"nova-cell1-conductor-db-sync-4sssx\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.663086 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4sssx\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.663151 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-scripts\") pod \"nova-cell1-conductor-db-sync-4sssx\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.678462 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4sssx"] Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.767758 4937 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-config-data\") pod \"nova-cell1-conductor-db-sync-4sssx\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.767824 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md7t9\" (UniqueName: \"kubernetes.io/projected/8a2ea03f-b76a-4775-b4ee-827cc43744c1-kube-api-access-md7t9\") pod \"nova-cell1-conductor-db-sync-4sssx\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.768006 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4sssx\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.768157 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-scripts\") pod \"nova-cell1-conductor-db-sync-4sssx\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.773049 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-config-data\") pod \"nova-cell1-conductor-db-sync-4sssx\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.774048 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-scripts\") pod \"nova-cell1-conductor-db-sync-4sssx\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.789539 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md7t9\" (UniqueName: \"kubernetes.io/projected/8a2ea03f-b76a-4775-b4ee-827cc43744c1-kube-api-access-md7t9\") pod \"nova-cell1-conductor-db-sync-4sssx\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.793143 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4sssx\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.872688 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db31ec58-43e7-4625-9551-84405e20c3f5","Type":"ContainerStarted","Data":"53baa6ef8a7caa4c908549a7faca413bbce3c7739779f90969f6a27464698f98"} Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.874051 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9ws68" 
event={"ID":"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28","Type":"ContainerStarted","Data":"722d68ceac1f97c959fb1d67ab4d13575be7102fd3836858ba1f7126552e450e"} Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.916344 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:38 crc kubenswrapper[4937]: I0225 16:14:38.944362 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.072371 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-vm27p"] Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.097646 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.324402 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.647759 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4sssx"] Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.918938 4937 generic.go:334] "Generic (PLEG): container finished" podID="22da4be7-1bfd-4df2-a66b-8bd47f08269c" containerID="ee001256fc234aa33337e8a6b6fb7c4dbb4f3469fb2eb31fab62ba5c262dd7bf" exitCode=0 Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.919121 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-vm27p" event={"ID":"22da4be7-1bfd-4df2-a66b-8bd47f08269c","Type":"ContainerDied","Data":"ee001256fc234aa33337e8a6b6fb7c4dbb4f3469fb2eb31fab62ba5c262dd7bf"} Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.919229 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-vm27p" event={"ID":"22da4be7-1bfd-4df2-a66b-8bd47f08269c","Type":"ContainerStarted","Data":"a93ffa38a03f0faea423b344ac23aeaba4b1e5a5d5d052810a0c558495c3ceac"} Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.921520 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1670d95c-7850-442b-8b88-4b1d20464736","Type":"ContainerStarted","Data":"955d55f43a2037fb0574b21c5a7c388b8880345a6114c1ef9bfb33d0e1fcc584"} Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.929827 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a72f6cf7-77da-41a8-ae53-eb3481d4bc55","Type":"ContainerStarted","Data":"25f3418ffe44555beb696c993e3c298b5eea7083d03a00145ba92ff580af69e8"} Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.938477 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9ws68" event={"ID":"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28","Type":"ContainerStarted","Data":"f833be4dfeb87ff3970dca032bfe28374fa43cab8bcdd748c1c3405ec0a1b478"} Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.953737 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4sssx" event={"ID":"8a2ea03f-b76a-4775-b4ee-827cc43744c1","Type":"ContainerStarted","Data":"67709ca9aa59f7bbcd054be88a08866a57a2217869db089270bc1a4e6627a173"} Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.961618 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7030cdee-8f14-4cd8-959a-f941ac0414e9","Type":"ContainerStarted","Data":"e3d89d9183e76e7e19ae1f8aadba0108919676add1b655fc803e234fe6f91386"} Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.962668 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.964884 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"692747a2-c012-4264-8346-f4aa6755f93c","Type":"ContainerStarted","Data":"795571e170388f79ef9c790373e9b93225a6a490e8909e22e6f5e9e1e3bfd58e"} Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.979947 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9ws68" podStartSLOduration=2.97993086 podStartE2EDuration="2.97993086s" podCreationTimestamp="2026-02-25 16:14:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:14:39.967380505 +0000 UTC m=+1730.980772405" watchObservedRunningTime="2026-02-25 16:14:39.97993086 +0000 UTC m=+1730.993322750" Feb 25 16:14:39 crc kubenswrapper[4937]: I0225 16:14:39.996257 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.698488594 podStartE2EDuration="12.996241998s" podCreationTimestamp="2026-02-25 16:14:27 +0000 UTC" firstStartedPulling="2026-02-25 16:14:28.311761237 +0000 UTC m=+1719.325153147" lastFinishedPulling="2026-02-25 16:14:38.609514661 +0000 UTC m=+1729.622906551" observedRunningTime="2026-02-25 16:14:39.988222957 +0000 UTC m=+1731.001614877" watchObservedRunningTime="2026-02-25 16:14:39.996241998 +0000 UTC m=+1731.009633888" Feb 25 16:14:40 crc kubenswrapper[4937]: I0225 16:14:40.977352 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4sssx" event={"ID":"8a2ea03f-b76a-4775-b4ee-827cc43744c1","Type":"ContainerStarted","Data":"818b0815324ce9f2eb9cf06dc48a11c257685002beb36cff09bf2a15122fa9df"} Feb 25 16:14:40 crc kubenswrapper[4937]: I0225 16:14:40.986540 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-vm27p" event={"ID":"22da4be7-1bfd-4df2-a66b-8bd47f08269c","Type":"ContainerStarted","Data":"4fce1db9ef8aed180141acc2e44adc14eee51286f48564b0d2fdf56c0b65e530"} Feb 25 16:14:40 crc kubenswrapper[4937]: I0225 16:14:40.987846 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:41 crc kubenswrapper[4937]: I0225 16:14:41.028643 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cd565959-vm27p" podStartSLOduration=4.028622188 podStartE2EDuration="4.028622188s" podCreationTimestamp="2026-02-25 16:14:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:14:41.019551391 +0000 UTC m=+1732.032943291" watchObservedRunningTime="2026-02-25 16:14:41.028622188 +0000 UTC m=+1732.042014078" Feb 25 16:14:41 crc kubenswrapper[4937]: I0225 16:14:41.035229 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-4sssx" podStartSLOduration=3.035202553 podStartE2EDuration="3.035202553s" podCreationTimestamp="2026-02-25 16:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:14:40.999727384 +0000 UTC m=+1732.013119274" watchObservedRunningTime="2026-02-25 16:14:41.035202553 +0000 UTC m=+1732.048594443" Feb 25 16:14:41 crc kubenswrapper[4937]: I0225 16:14:41.234108 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 16:14:41 crc kubenswrapper[4937]: I0225 16:14:41.244612 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:42 crc kubenswrapper[4937]: I0225 16:14:42.933613 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2vtpf"] Feb 25 16:14:42 crc kubenswrapper[4937]: I0225 16:14:42.940138 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:14:42 crc kubenswrapper[4937]: I0225 16:14:42.956766 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2vtpf"] Feb 25 16:14:43 crc kubenswrapper[4937]: I0225 16:14:43.074993 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlk9g\" (UniqueName: \"kubernetes.io/projected/2462d0bc-6986-4337-a05a-863a45a45393-kube-api-access-dlk9g\") pod \"community-operators-2vtpf\" (UID: \"2462d0bc-6986-4337-a05a-863a45a45393\") " pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:14:43 crc kubenswrapper[4937]: I0225 16:14:43.075210 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2462d0bc-6986-4337-a05a-863a45a45393-catalog-content\") pod \"community-operators-2vtpf\" (UID: \"2462d0bc-6986-4337-a05a-863a45a45393\") " pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:14:43 crc kubenswrapper[4937]: I0225 16:14:43.075546 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2462d0bc-6986-4337-a05a-863a45a45393-utilities\") pod \"community-operators-2vtpf\" (UID: \"2462d0bc-6986-4337-a05a-863a45a45393\") " pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:14:43 crc kubenswrapper[4937]: I0225 16:14:43.178074 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2462d0bc-6986-4337-a05a-863a45a45393-utilities\") pod \"community-operators-2vtpf\" (UID: \"2462d0bc-6986-4337-a05a-863a45a45393\") " pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:14:43 crc kubenswrapper[4937]: I0225 16:14:43.178341 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlk9g\" (UniqueName: \"kubernetes.io/projected/2462d0bc-6986-4337-a05a-863a45a45393-kube-api-access-dlk9g\") pod \"community-operators-2vtpf\" (UID: \"2462d0bc-6986-4337-a05a-863a45a45393\") " pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:14:43 crc kubenswrapper[4937]: I0225 16:14:43.178416 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2462d0bc-6986-4337-a05a-863a45a45393-catalog-content\") pod \"community-operators-2vtpf\" (UID: \"2462d0bc-6986-4337-a05a-863a45a45393\") " pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:14:43 crc kubenswrapper[4937]: I0225 
16:14:43.179274 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2462d0bc-6986-4337-a05a-863a45a45393-catalog-content\") pod \"community-operators-2vtpf\" (UID: \"2462d0bc-6986-4337-a05a-863a45a45393\") " pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:14:43 crc kubenswrapper[4937]: I0225 16:14:43.179357 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2462d0bc-6986-4337-a05a-863a45a45393-utilities\") pod \"community-operators-2vtpf\" (UID: \"2462d0bc-6986-4337-a05a-863a45a45393\") " pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:14:43 crc kubenswrapper[4937]: I0225 16:14:43.201287 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlk9g\" (UniqueName: \"kubernetes.io/projected/2462d0bc-6986-4337-a05a-863a45a45393-kube-api-access-dlk9g\") pod \"community-operators-2vtpf\" (UID: \"2462d0bc-6986-4337-a05a-863a45a45393\") " pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:14:43 crc kubenswrapper[4937]: I0225 16:14:43.268413 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:14:44 crc kubenswrapper[4937]: I0225 16:14:44.046886 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a72f6cf7-77da-41a8-ae53-eb3481d4bc55","Type":"ContainerStarted","Data":"8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593"} Feb 25 16:14:44 crc kubenswrapper[4937]: I0225 16:14:44.051833 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db31ec58-43e7-4625-9551-84405e20c3f5","Type":"ContainerStarted","Data":"2803d653b814fdf03abaca60765f621f6c161783bcd9ca50898875c9d5857eec"} Feb 25 16:14:44 crc kubenswrapper[4937]: I0225 16:14:44.054592 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"692747a2-c012-4264-8346-f4aa6755f93c","Type":"ContainerStarted","Data":"621a0e9511587d4cf0fcdbcba93a91cd49682927e54602f1421c64484dd6aaa1"} Feb 25 16:14:44 crc kubenswrapper[4937]: I0225 16:14:44.054727 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="692747a2-c012-4264-8346-f4aa6755f93c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://621a0e9511587d4cf0fcdbcba93a91cd49682927e54602f1421c64484dd6aaa1" gracePeriod=30 Feb 25 16:14:44 crc kubenswrapper[4937]: I0225 16:14:44.059868 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1670d95c-7850-442b-8b88-4b1d20464736","Type":"ContainerStarted","Data":"9f107e3aff32b2d3d5ee0344274edcfd799064835c2aaa615956b5230d30a2e9"} Feb 25 16:14:44 crc kubenswrapper[4937]: I0225 16:14:44.072783 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.883258387 podStartE2EDuration="7.072767513s" podCreationTimestamp="2026-02-25 16:14:37 +0000 UTC" firstStartedPulling="2026-02-25 16:14:39.447216785 +0000 UTC m=+1730.460608685" lastFinishedPulling="2026-02-25 16:14:43.636725921 +0000 UTC m=+1734.650117811" observedRunningTime="2026-02-25 16:14:44.072074515 +0000 UTC m=+1735.085466445" watchObservedRunningTime="2026-02-25 16:14:44.072767513 +0000 UTC m=+1735.086159403" Feb 25 16:14:44 crc 
kubenswrapper[4937]: I0225 16:14:44.100840 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.463054581 podStartE2EDuration="7.100821775s" podCreationTimestamp="2026-02-25 16:14:37 +0000 UTC" firstStartedPulling="2026-02-25 16:14:38.994007753 +0000 UTC m=+1730.007399653" lastFinishedPulling="2026-02-25 16:14:43.631774957 +0000 UTC m=+1734.645166847" observedRunningTime="2026-02-25 16:14:44.087663636 +0000 UTC m=+1735.101055526" watchObservedRunningTime="2026-02-25 16:14:44.100821775 +0000 UTC m=+1735.114213665" Feb 25 16:14:44 crc kubenswrapper[4937]: I0225 16:14:44.185948 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2vtpf"] Feb 25 16:14:44 crc kubenswrapper[4937]: W0225 16:14:44.193644 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2462d0bc_6986_4337_a05a_863a45a45393.slice/crio-850f264ffaa1d6cbdabe55ca322e0926480f945f8acea36c9b2b7a648a12b164 WatchSource:0}: Error finding container 850f264ffaa1d6cbdabe55ca322e0926480f945f8acea36c9b2b7a648a12b164: Status 404 returned error can't find the container with id 850f264ffaa1d6cbdabe55ca322e0926480f945f8acea36c9b2b7a648a12b164 Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.072446 4937 generic.go:334] "Generic (PLEG): container finished" podID="2462d0bc-6986-4337-a05a-863a45a45393" containerID="61568512f7ea525a4b8390c901299ac1af75e738759daeb330cf920b82a9980e" exitCode=0 Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.072533 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2vtpf" event={"ID":"2462d0bc-6986-4337-a05a-863a45a45393","Type":"ContainerDied","Data":"61568512f7ea525a4b8390c901299ac1af75e738759daeb330cf920b82a9980e"} Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.072779 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2vtpf" event={"ID":"2462d0bc-6986-4337-a05a-863a45a45393","Type":"ContainerStarted","Data":"850f264ffaa1d6cbdabe55ca322e0926480f945f8acea36c9b2b7a648a12b164"} Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.075438 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a72f6cf7-77da-41a8-ae53-eb3481d4bc55","Type":"ContainerStarted","Data":"7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c"} Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.075549 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a72f6cf7-77da-41a8-ae53-eb3481d4bc55" containerName="nova-metadata-log" containerID="cri-o://8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593" gracePeriod=30 Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.075568 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a72f6cf7-77da-41a8-ae53-eb3481d4bc55" containerName="nova-metadata-metadata" containerID="cri-o://7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c" gracePeriod=30 Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.087360 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db31ec58-43e7-4625-9551-84405e20c3f5","Type":"ContainerStarted","Data":"9a434b3980ce99761289913c0f390dea1b1b6729bebdc1de58d5004c3737f823"} Feb 25 16:14:45 crc 
kubenswrapper[4937]: I0225 16:14:45.127434 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.107720059 podStartE2EDuration="8.127415501s" podCreationTimestamp="2026-02-25 16:14:37 +0000 UTC" firstStartedPulling="2026-02-25 16:14:38.597714345 +0000 UTC m=+1729.611106235" lastFinishedPulling="2026-02-25 16:14:43.617409787 +0000 UTC m=+1734.630801677" observedRunningTime="2026-02-25 16:14:45.120650962 +0000 UTC m=+1736.134042852" watchObservedRunningTime="2026-02-25 16:14:45.127415501 +0000 UTC m=+1736.140807391" Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.157924 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.638852204 podStartE2EDuration="8.157895165s" podCreationTimestamp="2026-02-25 16:14:37 +0000 UTC" firstStartedPulling="2026-02-25 16:14:39.117638769 +0000 UTC m=+1730.131030659" lastFinishedPulling="2026-02-25 16:14:43.63668172 +0000 UTC m=+1734.650073620" observedRunningTime="2026-02-25 16:14:45.146447288 +0000 UTC m=+1736.159839178" watchObservedRunningTime="2026-02-25 16:14:45.157895165 +0000 UTC m=+1736.171287075" Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.859956 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.993603 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-combined-ca-bundle\") pod \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.993674 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz89r\" (UniqueName: \"kubernetes.io/projected/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-kube-api-access-rz89r\") pod \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.993773 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-config-data\") pod \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.993815 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-logs\") pod \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\" (UID: \"a72f6cf7-77da-41a8-ae53-eb3481d4bc55\") " Feb 25 16:14:45 crc kubenswrapper[4937]: I0225 16:14:45.994671 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-logs" (OuterVolumeSpecName: "logs") pod "a72f6cf7-77da-41a8-ae53-eb3481d4bc55" (UID: "a72f6cf7-77da-41a8-ae53-eb3481d4bc55"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.000986 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-kube-api-access-rz89r" (OuterVolumeSpecName: "kube-api-access-rz89r") pod "a72f6cf7-77da-41a8-ae53-eb3481d4bc55" (UID: "a72f6cf7-77da-41a8-ae53-eb3481d4bc55"). InnerVolumeSpecName "kube-api-access-rz89r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.023961 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a72f6cf7-77da-41a8-ae53-eb3481d4bc55" (UID: "a72f6cf7-77da-41a8-ae53-eb3481d4bc55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.040948 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-config-data" (OuterVolumeSpecName: "config-data") pod "a72f6cf7-77da-41a8-ae53-eb3481d4bc55" (UID: "a72f6cf7-77da-41a8-ae53-eb3481d4bc55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.095758 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.095791 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz89r\" (UniqueName: \"kubernetes.io/projected/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-kube-api-access-rz89r\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.095804 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.095812 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72f6cf7-77da-41a8-ae53-eb3481d4bc55-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.109712 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2vtpf" event={"ID":"2462d0bc-6986-4337-a05a-863a45a45393","Type":"ContainerStarted","Data":"f8c96562bdcba732dfa1548fc9056841f26f245f11b1d0e5fac61d7755725289"} Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.114819 4937 generic.go:334] "Generic (PLEG): container finished" podID="a72f6cf7-77da-41a8-ae53-eb3481d4bc55" containerID="7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c" exitCode=0 Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.114847 4937 generic.go:334] "Generic (PLEG): container finished" podID="a72f6cf7-77da-41a8-ae53-eb3481d4bc55" containerID="8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593" exitCode=143 Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.114893 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"a72f6cf7-77da-41a8-ae53-eb3481d4bc55","Type":"ContainerDied","Data":"7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c"} Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.114949 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a72f6cf7-77da-41a8-ae53-eb3481d4bc55","Type":"ContainerDied","Data":"8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593"} Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.114963 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a72f6cf7-77da-41a8-ae53-eb3481d4bc55","Type":"ContainerDied","Data":"25f3418ffe44555beb696c993e3c298b5eea7083d03a00145ba92ff580af69e8"} Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.114982 4937 scope.go:117] "RemoveContainer" containerID="7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.114913 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.156442 4937 scope.go:117] "RemoveContainer" containerID="8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.163666 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.177200 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.191247 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:46 crc kubenswrapper[4937]: E0225 16:14:46.191846 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72f6cf7-77da-41a8-ae53-eb3481d4bc55" containerName="nova-metadata-log" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.191864 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72f6cf7-77da-41a8-ae53-eb3481d4bc55" containerName="nova-metadata-log" Feb 25 16:14:46 crc kubenswrapper[4937]: E0225 16:14:46.191888 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72f6cf7-77da-41a8-ae53-eb3481d4bc55" containerName="nova-metadata-metadata" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.191896 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72f6cf7-77da-41a8-ae53-eb3481d4bc55" containerName="nova-metadata-metadata" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.192208 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72f6cf7-77da-41a8-ae53-eb3481d4bc55" containerName="nova-metadata-metadata" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.192249 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72f6cf7-77da-41a8-ae53-eb3481d4bc55" containerName="nova-metadata-log" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.193716 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.197413 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.197468 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.225214 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.250736 4937 scope.go:117] "RemoveContainer" containerID="7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c" Feb 25 16:14:46 crc kubenswrapper[4937]: E0225 16:14:46.252927 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c\": container with ID starting with 7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c not found: ID does not exist" containerID="7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.252983 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c"} err="failed to get container status \"7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c\": rpc error: code = NotFound desc = could not find container \"7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c\": container with ID starting with 7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c not found: ID does not exist" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.253017 4937 scope.go:117] "RemoveContainer" containerID="8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593" Feb 25 16:14:46 crc kubenswrapper[4937]: E0225 16:14:46.254994 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593\": container with ID starting with 8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593 not found: ID does not exist" containerID="8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.255028 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593"} err="failed to get container status \"8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593\": rpc error: code = NotFound desc = could not find container \"8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593\": container with ID starting with 8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593 not found: ID does not exist" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.255048 4937 scope.go:117] "RemoveContainer" containerID="7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.270676 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c"} err="failed to get container status \"7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c\": rpc error: 
code = NotFound desc = could not find container \"7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c\": container with ID starting with 7e24c0a7d235fa4a041e5fa73522858aa9d9267682e6e467ac58e5fd785e1f0c not found: ID does not exist" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.270730 4937 scope.go:117] "RemoveContainer" containerID="8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.274632 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593"} err="failed to get container status \"8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593\": rpc error: code = NotFound desc = could not find container \"8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593\": container with ID starting with 8ac9044e5a067af099e0ecd3f8b4e99d56d73d0c5cb3268886b420fa33118593 not found: ID does not exist" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.303543 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.303630 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b1f91f-d0f3-450e-aaf4-246815fcade3-logs\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.303675 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-config-data\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.303740 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.303813 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svtzd\" (UniqueName: \"kubernetes.io/projected/e0b1f91f-d0f3-450e-aaf4-246815fcade3-kube-api-access-svtzd\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.406854 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.406919 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b1f91f-d0f3-450e-aaf4-246815fcade3-logs\") pod 
\"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.406958 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-config-data\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.407002 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.407054 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svtzd\" (UniqueName: \"kubernetes.io/projected/e0b1f91f-d0f3-450e-aaf4-246815fcade3-kube-api-access-svtzd\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.408055 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b1f91f-d0f3-450e-aaf4-246815fcade3-logs\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.412363 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.413420 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-config-data\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.415509 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.428002 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svtzd\" (UniqueName: \"kubernetes.io/projected/e0b1f91f-d0f3-450e-aaf4-246815fcade3-kube-api-access-svtzd\") pod \"nova-metadata-0\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " pod="openstack/nova-metadata-0" Feb 25 16:14:46 crc kubenswrapper[4937]: I0225 16:14:46.562403 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:14:47 crc kubenswrapper[4937]: I0225 16:14:47.065786 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:47 crc kubenswrapper[4937]: W0225 16:14:47.069536 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0b1f91f_d0f3_450e_aaf4_246815fcade3.slice/crio-4402b8d878fdbc2494fe4072250b0f4236bb83dd5ff2a45298dfad09dbd1075c WatchSource:0}: Error finding container 4402b8d878fdbc2494fe4072250b0f4236bb83dd5ff2a45298dfad09dbd1075c: Status 404 returned error can't find the container with id 4402b8d878fdbc2494fe4072250b0f4236bb83dd5ff2a45298dfad09dbd1075c Feb 25 16:14:47 crc kubenswrapper[4937]: I0225 16:14:47.126452 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0b1f91f-d0f3-450e-aaf4-246815fcade3","Type":"ContainerStarted","Data":"4402b8d878fdbc2494fe4072250b0f4236bb83dd5ff2a45298dfad09dbd1075c"} Feb 25 16:14:47 crc kubenswrapper[4937]: I0225 16:14:47.380890 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a72f6cf7-77da-41a8-ae53-eb3481d4bc55" path="/var/lib/kubelet/pods/a72f6cf7-77da-41a8-ae53-eb3481d4bc55/volumes" Feb 25 16:14:47 crc kubenswrapper[4937]: I0225 16:14:47.758922 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 16:14:47 crc kubenswrapper[4937]: I0225 16:14:47.759344 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.142720 4937 generic.go:334] "Generic (PLEG): container finished" podID="2462d0bc-6986-4337-a05a-863a45a45393" containerID="f8c96562bdcba732dfa1548fc9056841f26f245f11b1d0e5fac61d7755725289" exitCode=0 Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.142786 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2vtpf" event={"ID":"2462d0bc-6986-4337-a05a-863a45a45393","Type":"ContainerDied","Data":"f8c96562bdcba732dfa1548fc9056841f26f245f11b1d0e5fac61d7755725289"} Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.145189 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0b1f91f-d0f3-450e-aaf4-246815fcade3","Type":"ContainerStarted","Data":"fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65"} Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.145250 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0b1f91f-d0f3-450e-aaf4-246815fcade3","Type":"ContainerStarted","Data":"dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542"} Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.148798 4937 generic.go:334] "Generic (PLEG): container finished" podID="4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28" containerID="f833be4dfeb87ff3970dca032bfe28374fa43cab8bcdd748c1c3405ec0a1b478" exitCode=0 Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.148869 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9ws68" event={"ID":"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28","Type":"ContainerDied","Data":"f833be4dfeb87ff3970dca032bfe28374fa43cab8bcdd748c1c3405ec0a1b478"} Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.158410 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 25 
16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.158690 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.169724 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.188907 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.194186 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.199823 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.199801173 podStartE2EDuration="2.199801173s" podCreationTimestamp="2026-02-25 16:14:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:14:48.19729156 +0000 UTC m=+1739.210683470" watchObservedRunningTime="2026-02-25 16:14:48.199801173 +0000 UTC m=+1739.213193063" Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.293004 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-xtnr5"] Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.293291 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" podUID="4cfc3af1-e6ed-4ac5-b539-822aecc38181" containerName="dnsmasq-dns" containerID="cri-o://21e7f388a9ba2cddfa57fd9bd86c95641b59d5fc846737f44344255bcf1a8ac6" gracePeriod=10 Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.371637 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:14:48 crc kubenswrapper[4937]: E0225 16:14:48.371902 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.843790 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="db31ec58-43e7-4625-9551-84405e20c3f5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 16:14:48 crc kubenswrapper[4937]: I0225 16:14:48.844403 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="db31ec58-43e7-4625-9551-84405e20c3f5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.068369 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.172007 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-config\") pod \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.172092 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-ovsdbserver-nb\") pod \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.172221 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-swift-storage-0\") pod \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.172250 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-svc\") pod \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.172274 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-ovsdbserver-sb\") pod \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.172306 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvhpp\" (UniqueName: \"kubernetes.io/projected/4cfc3af1-e6ed-4ac5-b539-822aecc38181-kube-api-access-pvhpp\") pod \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.199878 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cfc3af1-e6ed-4ac5-b539-822aecc38181-kube-api-access-pvhpp" (OuterVolumeSpecName: "kube-api-access-pvhpp") pod "4cfc3af1-e6ed-4ac5-b539-822aecc38181" (UID: "4cfc3af1-e6ed-4ac5-b539-822aecc38181"). InnerVolumeSpecName "kube-api-access-pvhpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.287031 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-config" (OuterVolumeSpecName: "config") pod "4cfc3af1-e6ed-4ac5-b539-822aecc38181" (UID: "4cfc3af1-e6ed-4ac5-b539-822aecc38181"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.288181 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4cfc3af1-e6ed-4ac5-b539-822aecc38181" (UID: "4cfc3af1-e6ed-4ac5-b539-822aecc38181"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.288338 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-svc\") pod \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\" (UID: \"4cfc3af1-e6ed-4ac5-b539-822aecc38181\") " Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.288883 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.288899 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvhpp\" (UniqueName: \"kubernetes.io/projected/4cfc3af1-e6ed-4ac5-b539-822aecc38181-kube-api-access-pvhpp\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:49 crc kubenswrapper[4937]: W0225 16:14:49.288965 4937 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4cfc3af1-e6ed-4ac5-b539-822aecc38181/volumes/kubernetes.io~configmap/dns-svc Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.288976 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4cfc3af1-e6ed-4ac5-b539-822aecc38181" (UID: "4cfc3af1-e6ed-4ac5-b539-822aecc38181"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.289720 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2vtpf" event={"ID":"2462d0bc-6986-4337-a05a-863a45a45393","Type":"ContainerStarted","Data":"e76eb058b9376dc15a57d0a8f480801464114829851c166f290a00336b05edcf"} Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.310014 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4cfc3af1-e6ed-4ac5-b539-822aecc38181" (UID: "4cfc3af1-e6ed-4ac5-b539-822aecc38181"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.315075 4937 generic.go:334] "Generic (PLEG): container finished" podID="4cfc3af1-e6ed-4ac5-b539-822aecc38181" containerID="21e7f388a9ba2cddfa57fd9bd86c95641b59d5fc846737f44344255bcf1a8ac6" exitCode=0 Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.315237 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" event={"ID":"4cfc3af1-e6ed-4ac5-b539-822aecc38181","Type":"ContainerDied","Data":"21e7f388a9ba2cddfa57fd9bd86c95641b59d5fc846737f44344255bcf1a8ac6"} Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.315275 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" event={"ID":"4cfc3af1-e6ed-4ac5-b539-822aecc38181","Type":"ContainerDied","Data":"77ffc8fbb64ad75e0108e3b6f483e81c97a1703acb4d7bbdaa0d3557817e2628"} Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.315294 4937 scope.go:117] "RemoveContainer" containerID="21e7f388a9ba2cddfa57fd9bd86c95641b59d5fc846737f44344255bcf1a8ac6" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.316629 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-xtnr5" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.346250 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2vtpf" podStartSLOduration=3.845174952 podStartE2EDuration="7.346232871s" podCreationTimestamp="2026-02-25 16:14:42 +0000 UTC" firstStartedPulling="2026-02-25 16:14:45.074447755 +0000 UTC m=+1736.087839665" lastFinishedPulling="2026-02-25 16:14:48.575505694 +0000 UTC m=+1739.588897584" observedRunningTime="2026-02-25 16:14:49.330852086 +0000 UTC m=+1740.344243976" watchObservedRunningTime="2026-02-25 16:14:49.346232871 +0000 UTC m=+1740.359624761" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.346316 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4cfc3af1-e6ed-4ac5-b539-822aecc38181" (UID: "4cfc3af1-e6ed-4ac5-b539-822aecc38181"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.385909 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4cfc3af1-e6ed-4ac5-b539-822aecc38181" (UID: "4cfc3af1-e6ed-4ac5-b539-822aecc38181"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.400820 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.400853 4937 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.400863 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.400872 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfc3af1-e6ed-4ac5-b539-822aecc38181-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.403475 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.418104 4937 scope.go:117] "RemoveContainer" containerID="ed872c321f61826d18ac43d6421bd6fc0d3e442e201ba2de37e63f957b6528b6" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.464761 4937 scope.go:117] "RemoveContainer" containerID="21e7f388a9ba2cddfa57fd9bd86c95641b59d5fc846737f44344255bcf1a8ac6" Feb 25 16:14:49 crc kubenswrapper[4937]: E0225 16:14:49.465904 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21e7f388a9ba2cddfa57fd9bd86c95641b59d5fc846737f44344255bcf1a8ac6\": container with ID starting with 
21e7f388a9ba2cddfa57fd9bd86c95641b59d5fc846737f44344255bcf1a8ac6 not found: ID does not exist" containerID="21e7f388a9ba2cddfa57fd9bd86c95641b59d5fc846737f44344255bcf1a8ac6" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.465943 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21e7f388a9ba2cddfa57fd9bd86c95641b59d5fc846737f44344255bcf1a8ac6"} err="failed to get container status \"21e7f388a9ba2cddfa57fd9bd86c95641b59d5fc846737f44344255bcf1a8ac6\": rpc error: code = NotFound desc = could not find container \"21e7f388a9ba2cddfa57fd9bd86c95641b59d5fc846737f44344255bcf1a8ac6\": container with ID starting with 21e7f388a9ba2cddfa57fd9bd86c95641b59d5fc846737f44344255bcf1a8ac6 not found: ID does not exist" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.465969 4937 scope.go:117] "RemoveContainer" containerID="ed872c321f61826d18ac43d6421bd6fc0d3e442e201ba2de37e63f957b6528b6" Feb 25 16:14:49 crc kubenswrapper[4937]: E0225 16:14:49.466380 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed872c321f61826d18ac43d6421bd6fc0d3e442e201ba2de37e63f957b6528b6\": container with ID starting with ed872c321f61826d18ac43d6421bd6fc0d3e442e201ba2de37e63f957b6528b6 not found: ID does not exist" containerID="ed872c321f61826d18ac43d6421bd6fc0d3e442e201ba2de37e63f957b6528b6" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.466405 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed872c321f61826d18ac43d6421bd6fc0d3e442e201ba2de37e63f957b6528b6"} err="failed to get container status \"ed872c321f61826d18ac43d6421bd6fc0d3e442e201ba2de37e63f957b6528b6\": rpc error: code = NotFound desc = could not find container \"ed872c321f61826d18ac43d6421bd6fc0d3e442e201ba2de37e63f957b6528b6\": container with ID starting with ed872c321f61826d18ac43d6421bd6fc0d3e442e201ba2de37e63f957b6528b6 not found: ID does not exist" Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.669087 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-xtnr5"] Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.704121 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-xtnr5"] Feb 25 16:14:49 crc kubenswrapper[4937]: I0225 16:14:49.913916 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.018390 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxsxk\" (UniqueName: \"kubernetes.io/projected/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-kube-api-access-pxsxk\") pod \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.018591 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-config-data\") pod \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.018716 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-scripts\") pod \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.018753 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-combined-ca-bundle\") pod \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\" (UID: \"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28\") " Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.024900 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-kube-api-access-pxsxk" (OuterVolumeSpecName: "kube-api-access-pxsxk") pod "4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28" (UID: "4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28"). InnerVolumeSpecName "kube-api-access-pxsxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.052474 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-scripts" (OuterVolumeSpecName: "scripts") pod "4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28" (UID: "4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.059830 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28" (UID: "4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.065958 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-config-data" (OuterVolumeSpecName: "config-data") pod "4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28" (UID: "4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.121297 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.121328 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.121353 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxsxk\" (UniqueName: \"kubernetes.io/projected/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-kube-api-access-pxsxk\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.121364 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.328736 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9ws68" event={"ID":"4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28","Type":"ContainerDied","Data":"722d68ceac1f97c959fb1d67ab4d13575be7102fd3836858ba1f7126552e450e"} Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.328771 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="722d68ceac1f97c959fb1d67ab4d13575be7102fd3836858ba1f7126552e450e" Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.328793 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9ws68" Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.330323 4937 generic.go:334] "Generic (PLEG): container finished" podID="8a2ea03f-b76a-4775-b4ee-827cc43744c1" containerID="818b0815324ce9f2eb9cf06dc48a11c257685002beb36cff09bf2a15122fa9df" exitCode=0 Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.331331 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4sssx" event={"ID":"8a2ea03f-b76a-4775-b4ee-827cc43744c1","Type":"ContainerDied","Data":"818b0815324ce9f2eb9cf06dc48a11c257685002beb36cff09bf2a15122fa9df"} Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.418829 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.419160 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="db31ec58-43e7-4625-9551-84405e20c3f5" containerName="nova-api-log" containerID="cri-o://2803d653b814fdf03abaca60765f621f6c161783bcd9ca50898875c9d5857eec" gracePeriod=30 Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.419868 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="db31ec58-43e7-4625-9551-84405e20c3f5" containerName="nova-api-api" containerID="cri-o://9a434b3980ce99761289913c0f390dea1b1b6729bebdc1de58d5004c3737f823" gracePeriod=30 Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.442283 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.460397 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 
16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.460763 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e0b1f91f-d0f3-450e-aaf4-246815fcade3" containerName="nova-metadata-log" containerID="cri-o://dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542" gracePeriod=30 Feb 25 16:14:50 crc kubenswrapper[4937]: I0225 16:14:50.461378 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e0b1f91f-d0f3-450e-aaf4-246815fcade3" containerName="nova-metadata-metadata" containerID="cri-o://fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65" gracePeriod=30 Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.259318 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.353748 4937 generic.go:334] "Generic (PLEG): container finished" podID="db31ec58-43e7-4625-9551-84405e20c3f5" containerID="2803d653b814fdf03abaca60765f621f6c161783bcd9ca50898875c9d5857eec" exitCode=143 Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.353820 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db31ec58-43e7-4625-9551-84405e20c3f5","Type":"ContainerDied","Data":"2803d653b814fdf03abaca60765f621f6c161783bcd9ca50898875c9d5857eec"} Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.363179 4937 generic.go:334] "Generic (PLEG): container finished" podID="e0b1f91f-d0f3-450e-aaf4-246815fcade3" containerID="fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65" exitCode=0 Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.363208 4937 generic.go:334] "Generic (PLEG): container finished" podID="e0b1f91f-d0f3-450e-aaf4-246815fcade3" containerID="dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542" exitCode=143 Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.363236 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.363279 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0b1f91f-d0f3-450e-aaf4-246815fcade3","Type":"ContainerDied","Data":"fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65"} Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.363334 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0b1f91f-d0f3-450e-aaf4-246815fcade3","Type":"ContainerDied","Data":"dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542"} Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.363347 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0b1f91f-d0f3-450e-aaf4-246815fcade3","Type":"ContainerDied","Data":"4402b8d878fdbc2494fe4072250b0f4236bb83dd5ff2a45298dfad09dbd1075c"} Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.363365 4937 scope.go:117] "RemoveContainer" containerID="fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.425386 4937 scope.go:117] "RemoveContainer" containerID="dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.436566 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cfc3af1-e6ed-4ac5-b539-822aecc38181" path="/var/lib/kubelet/pods/4cfc3af1-e6ed-4ac5-b539-822aecc38181/volumes" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.456502 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b1f91f-d0f3-450e-aaf4-246815fcade3-logs\") pod \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.456589 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svtzd\" (UniqueName: \"kubernetes.io/projected/e0b1f91f-d0f3-450e-aaf4-246815fcade3-kube-api-access-svtzd\") pod \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.456841 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-nova-metadata-tls-certs\") pod \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.456879 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-config-data\") pod \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.456958 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-combined-ca-bundle\") pod \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\" (UID: \"e0b1f91f-d0f3-450e-aaf4-246815fcade3\") " Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.462353 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b1f91f-d0f3-450e-aaf4-246815fcade3-logs" 
(OuterVolumeSpecName: "logs") pod "e0b1f91f-d0f3-450e-aaf4-246815fcade3" (UID: "e0b1f91f-d0f3-450e-aaf4-246815fcade3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.468361 4937 scope.go:117] "RemoveContainer" containerID="fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65" Feb 25 16:14:51 crc kubenswrapper[4937]: E0225 16:14:51.469745 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65\": container with ID starting with fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65 not found: ID does not exist" containerID="fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.469785 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65"} err="failed to get container status \"fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65\": rpc error: code = NotFound desc = could not find container \"fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65\": container with ID starting with fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65 not found: ID does not exist" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.469816 4937 scope.go:117] "RemoveContainer" containerID="dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542" Feb 25 16:14:51 crc kubenswrapper[4937]: E0225 16:14:51.470879 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542\": container with ID starting with dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542 not found: ID does not exist" containerID="dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.470900 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542"} err="failed to get container status \"dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542\": rpc error: code = NotFound desc = could not find container \"dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542\": container with ID starting with dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542 not found: ID does not exist" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.470914 4937 scope.go:117] "RemoveContainer" containerID="fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.471537 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65"} err="failed to get container status \"fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65\": rpc error: code = NotFound desc = could not find container \"fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65\": container with ID starting with fcc72dc7fce58220e33990475c361638d9d6971156dd0e7bba48dab847eb1b65 not found: ID does not exist" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.471554 4937 scope.go:117] "RemoveContainer" 
containerID="dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.472535 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542"} err="failed to get container status \"dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542\": rpc error: code = NotFound desc = could not find container \"dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542\": container with ID starting with dc48106c1cf0fd297397d509f3ce6c242f8671cc642c67a209463305219bf542 not found: ID does not exist" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.494950 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b1f91f-d0f3-450e-aaf4-246815fcade3-kube-api-access-svtzd" (OuterVolumeSpecName: "kube-api-access-svtzd") pod "e0b1f91f-d0f3-450e-aaf4-246815fcade3" (UID: "e0b1f91f-d0f3-450e-aaf4-246815fcade3"). InnerVolumeSpecName "kube-api-access-svtzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.495508 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-config-data" (OuterVolumeSpecName: "config-data") pod "e0b1f91f-d0f3-450e-aaf4-246815fcade3" (UID: "e0b1f91f-d0f3-450e-aaf4-246815fcade3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.500102 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0b1f91f-d0f3-450e-aaf4-246815fcade3" (UID: "e0b1f91f-d0f3-450e-aaf4-246815fcade3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.557608 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e0b1f91f-d0f3-450e-aaf4-246815fcade3" (UID: "e0b1f91f-d0f3-450e-aaf4-246815fcade3"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.562285 4937 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.562326 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.562336 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b1f91f-d0f3-450e-aaf4-246815fcade3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.562347 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b1f91f-d0f3-450e-aaf4-246815fcade3-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.562357 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svtzd\" (UniqueName: \"kubernetes.io/projected/e0b1f91f-d0f3-450e-aaf4-246815fcade3-kube-api-access-svtzd\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.712561 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.731924 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.741822 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:51 crc kubenswrapper[4937]: E0225 16:14:51.743138 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b1f91f-d0f3-450e-aaf4-246815fcade3" containerName="nova-metadata-log" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.743158 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b1f91f-d0f3-450e-aaf4-246815fcade3" containerName="nova-metadata-log" Feb 25 16:14:51 crc kubenswrapper[4937]: E0225 16:14:51.743173 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfc3af1-e6ed-4ac5-b539-822aecc38181" containerName="dnsmasq-dns" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.743180 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfc3af1-e6ed-4ac5-b539-822aecc38181" containerName="dnsmasq-dns" Feb 25 16:14:51 crc kubenswrapper[4937]: E0225 16:14:51.743200 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfc3af1-e6ed-4ac5-b539-822aecc38181" containerName="init" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.743206 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfc3af1-e6ed-4ac5-b539-822aecc38181" containerName="init" Feb 25 16:14:51 crc kubenswrapper[4937]: E0225 16:14:51.743217 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b1f91f-d0f3-450e-aaf4-246815fcade3" containerName="nova-metadata-metadata" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.743223 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b1f91f-d0f3-450e-aaf4-246815fcade3" containerName="nova-metadata-metadata" Feb 25 16:14:51 crc kubenswrapper[4937]: E0225 16:14:51.743237 4937 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28" containerName="nova-manage" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.743242 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28" containerName="nova-manage" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.743540 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b1f91f-d0f3-450e-aaf4-246815fcade3" containerName="nova-metadata-log" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.743559 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28" containerName="nova-manage" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.743575 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cfc3af1-e6ed-4ac5-b539-822aecc38181" containerName="dnsmasq-dns" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.743586 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b1f91f-d0f3-450e-aaf4-246815fcade3" containerName="nova-metadata-metadata" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.744931 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.750099 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.750364 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.754991 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.874673 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.874713 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-config-data\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.874750 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.874767 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69898463-2d64-46cb-8d7a-ff187bb8b0a1-logs\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.874941 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5tc7\" (UniqueName: \"kubernetes.io/projected/69898463-2d64-46cb-8d7a-ff187bb8b0a1-kube-api-access-m5tc7\") 
pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.875625 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.976377 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-combined-ca-bundle\") pod \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.976599 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-config-data\") pod \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.976760 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md7t9\" (UniqueName: \"kubernetes.io/projected/8a2ea03f-b76a-4775-b4ee-827cc43744c1-kube-api-access-md7t9\") pod \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.976832 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-scripts\") pod \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\" (UID: \"8a2ea03f-b76a-4775-b4ee-827cc43744c1\") " Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.977439 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5tc7\" (UniqueName: \"kubernetes.io/projected/69898463-2d64-46cb-8d7a-ff187bb8b0a1-kube-api-access-m5tc7\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.977649 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.977769 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-config-data\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.978595 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.978709 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69898463-2d64-46cb-8d7a-ff187bb8b0a1-logs\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc 
kubenswrapper[4937]: I0225 16:14:51.979347 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69898463-2d64-46cb-8d7a-ff187bb8b0a1-logs\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.981085 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-scripts" (OuterVolumeSpecName: "scripts") pod "8a2ea03f-b76a-4775-b4ee-827cc43744c1" (UID: "8a2ea03f-b76a-4775-b4ee-827cc43744c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.981918 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a2ea03f-b76a-4775-b4ee-827cc43744c1-kube-api-access-md7t9" (OuterVolumeSpecName: "kube-api-access-md7t9") pod "8a2ea03f-b76a-4775-b4ee-827cc43744c1" (UID: "8a2ea03f-b76a-4775-b4ee-827cc43744c1"). InnerVolumeSpecName "kube-api-access-md7t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.982145 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.982276 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-config-data\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.983203 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:51 crc kubenswrapper[4937]: I0225 16:14:51.997084 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5tc7\" (UniqueName: \"kubernetes.io/projected/69898463-2d64-46cb-8d7a-ff187bb8b0a1-kube-api-access-m5tc7\") pod \"nova-metadata-0\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " pod="openstack/nova-metadata-0" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.010868 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-config-data" (OuterVolumeSpecName: "config-data") pod "8a2ea03f-b76a-4775-b4ee-827cc43744c1" (UID: "8a2ea03f-b76a-4775-b4ee-827cc43744c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.012821 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a2ea03f-b76a-4775-b4ee-827cc43744c1" (UID: "8a2ea03f-b76a-4775-b4ee-827cc43744c1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.066472 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.081378 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.081445 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md7t9\" (UniqueName: \"kubernetes.io/projected/8a2ea03f-b76a-4775-b4ee-827cc43744c1-kube-api-access-md7t9\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.081474 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.081536 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a2ea03f-b76a-4775-b4ee-827cc43744c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.380540 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4sssx" event={"ID":"8a2ea03f-b76a-4775-b4ee-827cc43744c1","Type":"ContainerDied","Data":"67709ca9aa59f7bbcd054be88a08866a57a2217869db089270bc1a4e6627a173"} Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.380601 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67709ca9aa59f7bbcd054be88a08866a57a2217869db089270bc1a4e6627a173" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.380553 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4sssx" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.383104 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1670d95c-7850-442b-8b88-4b1d20464736" containerName="nova-scheduler-scheduler" containerID="cri-o://9f107e3aff32b2d3d5ee0344274edcfd799064835c2aaa615956b5230d30a2e9" gracePeriod=30 Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.432607 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 25 16:14:52 crc kubenswrapper[4937]: E0225 16:14:52.438463 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2ea03f-b76a-4775-b4ee-827cc43744c1" containerName="nova-cell1-conductor-db-sync" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.438517 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2ea03f-b76a-4775-b4ee-827cc43744c1" containerName="nova-cell1-conductor-db-sync" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.438791 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a2ea03f-b76a-4775-b4ee-827cc43744c1" containerName="nova-cell1-conductor-db-sync" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.439785 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.452896 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.464995 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.559455 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.592085 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95d4ab5-6b4e-477f-848e-0b98b93c8ba1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f95d4ab5-6b4e-477f-848e-0b98b93c8ba1\") " pod="openstack/nova-cell1-conductor-0" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.592203 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trf4n\" (UniqueName: \"kubernetes.io/projected/f95d4ab5-6b4e-477f-848e-0b98b93c8ba1-kube-api-access-trf4n\") pod \"nova-cell1-conductor-0\" (UID: \"f95d4ab5-6b4e-477f-848e-0b98b93c8ba1\") " pod="openstack/nova-cell1-conductor-0" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.592255 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95d4ab5-6b4e-477f-848e-0b98b93c8ba1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f95d4ab5-6b4e-477f-848e-0b98b93c8ba1\") " pod="openstack/nova-cell1-conductor-0" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.694091 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trf4n\" (UniqueName: \"kubernetes.io/projected/f95d4ab5-6b4e-477f-848e-0b98b93c8ba1-kube-api-access-trf4n\") pod \"nova-cell1-conductor-0\" (UID: \"f95d4ab5-6b4e-477f-848e-0b98b93c8ba1\") " pod="openstack/nova-cell1-conductor-0" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.694162 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95d4ab5-6b4e-477f-848e-0b98b93c8ba1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f95d4ab5-6b4e-477f-848e-0b98b93c8ba1\") " pod="openstack/nova-cell1-conductor-0" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.694303 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95d4ab5-6b4e-477f-848e-0b98b93c8ba1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f95d4ab5-6b4e-477f-848e-0b98b93c8ba1\") " pod="openstack/nova-cell1-conductor-0" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.705455 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95d4ab5-6b4e-477f-848e-0b98b93c8ba1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f95d4ab5-6b4e-477f-848e-0b98b93c8ba1\") " pod="openstack/nova-cell1-conductor-0" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.705532 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95d4ab5-6b4e-477f-848e-0b98b93c8ba1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"f95d4ab5-6b4e-477f-848e-0b98b93c8ba1\") " pod="openstack/nova-cell1-conductor-0" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.716367 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trf4n\" (UniqueName: \"kubernetes.io/projected/f95d4ab5-6b4e-477f-848e-0b98b93c8ba1-kube-api-access-trf4n\") pod \"nova-cell1-conductor-0\" (UID: \"f95d4ab5-6b4e-477f-848e-0b98b93c8ba1\") " pod="openstack/nova-cell1-conductor-0" Feb 25 16:14:52 crc kubenswrapper[4937]: I0225 16:14:52.784879 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 25 16:14:53 crc kubenswrapper[4937]: E0225 16:14:53.161810 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f107e3aff32b2d3d5ee0344274edcfd799064835c2aaa615956b5230d30a2e9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 25 16:14:53 crc kubenswrapper[4937]: E0225 16:14:53.163815 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f107e3aff32b2d3d5ee0344274edcfd799064835c2aaa615956b5230d30a2e9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 25 16:14:53 crc kubenswrapper[4937]: E0225 16:14:53.166764 4937 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f107e3aff32b2d3d5ee0344274edcfd799064835c2aaa615956b5230d30a2e9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 25 16:14:53 crc kubenswrapper[4937]: E0225 16:14:53.166845 4937 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1670d95c-7850-442b-8b88-4b1d20464736" containerName="nova-scheduler-scheduler" Feb 25 16:14:53 crc kubenswrapper[4937]: I0225 16:14:53.261875 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 25 16:14:53 crc kubenswrapper[4937]: I0225 16:14:53.269432 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:14:53 crc kubenswrapper[4937]: I0225 16:14:53.269530 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:14:53 crc kubenswrapper[4937]: I0225 16:14:53.381627 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b1f91f-d0f3-450e-aaf4-246815fcade3" path="/var/lib/kubelet/pods/e0b1f91f-d0f3-450e-aaf4-246815fcade3/volumes" Feb 25 16:14:53 crc kubenswrapper[4937]: I0225 16:14:53.394821 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f95d4ab5-6b4e-477f-848e-0b98b93c8ba1","Type":"ContainerStarted","Data":"a183c183a22c708051b151c76a9c3563ad1071c709c952987dde5cc6e2b3384b"} Feb 25 16:14:53 crc kubenswrapper[4937]: I0225 16:14:53.400007 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"69898463-2d64-46cb-8d7a-ff187bb8b0a1","Type":"ContainerStarted","Data":"4c5f94129e556d2f5cb898836f8defef125f08ff7b6c9b2bdba3438fc000d9eb"} Feb 25 16:14:53 crc kubenswrapper[4937]: I0225 16:14:53.400057 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69898463-2d64-46cb-8d7a-ff187bb8b0a1","Type":"ContainerStarted","Data":"6ca7c5615756355856ccb79658ed2df5e622cdfb86e80a8345524cb17206fcb1"} Feb 25 16:14:53 crc kubenswrapper[4937]: I0225 16:14:53.400073 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69898463-2d64-46cb-8d7a-ff187bb8b0a1","Type":"ContainerStarted","Data":"bef984373fc8c782dab8ea82cc314d06825e0005473442a64da5691278e51ffc"} Feb 25 16:14:53 crc kubenswrapper[4937]: I0225 16:14:53.424187 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.424167885 podStartE2EDuration="2.424167885s" podCreationTimestamp="2026-02-25 16:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:14:53.417761474 +0000 UTC m=+1744.431153374" watchObservedRunningTime="2026-02-25 16:14:53.424167885 +0000 UTC m=+1744.437559775" Feb 25 16:14:54 crc kubenswrapper[4937]: I0225 16:14:54.346017 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-2vtpf" podUID="2462d0bc-6986-4337-a05a-863a45a45393" containerName="registry-server" probeResult="failure" output=< Feb 25 16:14:54 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 16:14:54 crc kubenswrapper[4937]: > Feb 25 16:14:54 crc kubenswrapper[4937]: I0225 16:14:54.416290 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f95d4ab5-6b4e-477f-848e-0b98b93c8ba1","Type":"ContainerStarted","Data":"17b3351c0c6aa7ba407ce7d16720199185b011673f86c050e9fe77dc7c1f43ae"} Feb 25 16:14:54 crc kubenswrapper[4937]: I0225 16:14:54.418208 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 25 16:14:54 crc kubenswrapper[4937]: I0225 16:14:54.444075 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.444054112 podStartE2EDuration="2.444054112s" podCreationTimestamp="2026-02-25 16:14:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:14:54.43318901 +0000 UTC m=+1745.446580910" watchObservedRunningTime="2026-02-25 16:14:54.444054112 +0000 UTC m=+1745.457446002" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.237915 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rpnbs"] Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.240719 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.255240 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rpnbs"] Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.363283 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6bk5\" (UniqueName: \"kubernetes.io/projected/4f500ae5-7183-44f5-ba92-08705866baf2-kube-api-access-j6bk5\") pod \"redhat-marketplace-rpnbs\" (UID: \"4f500ae5-7183-44f5-ba92-08705866baf2\") " pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.363351 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f500ae5-7183-44f5-ba92-08705866baf2-catalog-content\") pod \"redhat-marketplace-rpnbs\" (UID: \"4f500ae5-7183-44f5-ba92-08705866baf2\") " pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.363817 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f500ae5-7183-44f5-ba92-08705866baf2-utilities\") pod \"redhat-marketplace-rpnbs\" (UID: \"4f500ae5-7183-44f5-ba92-08705866baf2\") " pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.442613 4937 generic.go:334] "Generic (PLEG): container finished" podID="db31ec58-43e7-4625-9551-84405e20c3f5" containerID="9a434b3980ce99761289913c0f390dea1b1b6729bebdc1de58d5004c3737f823" exitCode=0 Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.443527 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db31ec58-43e7-4625-9551-84405e20c3f5","Type":"ContainerDied","Data":"9a434b3980ce99761289913c0f390dea1b1b6729bebdc1de58d5004c3737f823"} Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.466932 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f500ae5-7183-44f5-ba92-08705866baf2-utilities\") pod \"redhat-marketplace-rpnbs\" (UID: \"4f500ae5-7183-44f5-ba92-08705866baf2\") " pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.467140 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6bk5\" (UniqueName: \"kubernetes.io/projected/4f500ae5-7183-44f5-ba92-08705866baf2-kube-api-access-j6bk5\") pod \"redhat-marketplace-rpnbs\" (UID: \"4f500ae5-7183-44f5-ba92-08705866baf2\") " pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.467188 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f500ae5-7183-44f5-ba92-08705866baf2-catalog-content\") pod \"redhat-marketplace-rpnbs\" (UID: \"4f500ae5-7183-44f5-ba92-08705866baf2\") " pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.468702 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f500ae5-7183-44f5-ba92-08705866baf2-utilities\") pod \"redhat-marketplace-rpnbs\" (UID: \"4f500ae5-7183-44f5-ba92-08705866baf2\") " 
pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.469638 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f500ae5-7183-44f5-ba92-08705866baf2-catalog-content\") pod \"redhat-marketplace-rpnbs\" (UID: \"4f500ae5-7183-44f5-ba92-08705866baf2\") " pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.495297 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6bk5\" (UniqueName: \"kubernetes.io/projected/4f500ae5-7183-44f5-ba92-08705866baf2-kube-api-access-j6bk5\") pod \"redhat-marketplace-rpnbs\" (UID: \"4f500ae5-7183-44f5-ba92-08705866baf2\") " pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.546945 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.567707 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.568886 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db31ec58-43e7-4625-9551-84405e20c3f5-combined-ca-bundle\") pod \"db31ec58-43e7-4625-9551-84405e20c3f5\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.568970 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db31ec58-43e7-4625-9551-84405e20c3f5-logs\") pod \"db31ec58-43e7-4625-9551-84405e20c3f5\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.569007 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db31ec58-43e7-4625-9551-84405e20c3f5-config-data\") pod \"db31ec58-43e7-4625-9551-84405e20c3f5\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.569050 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fmt9\" (UniqueName: \"kubernetes.io/projected/db31ec58-43e7-4625-9551-84405e20c3f5-kube-api-access-4fmt9\") pod \"db31ec58-43e7-4625-9551-84405e20c3f5\" (UID: \"db31ec58-43e7-4625-9551-84405e20c3f5\") " Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.571074 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db31ec58-43e7-4625-9551-84405e20c3f5-logs" (OuterVolumeSpecName: "logs") pod "db31ec58-43e7-4625-9551-84405e20c3f5" (UID: "db31ec58-43e7-4625-9551-84405e20c3f5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.574276 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db31ec58-43e7-4625-9551-84405e20c3f5-kube-api-access-4fmt9" (OuterVolumeSpecName: "kube-api-access-4fmt9") pod "db31ec58-43e7-4625-9551-84405e20c3f5" (UID: "db31ec58-43e7-4625-9551-84405e20c3f5"). InnerVolumeSpecName "kube-api-access-4fmt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.625020 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db31ec58-43e7-4625-9551-84405e20c3f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db31ec58-43e7-4625-9551-84405e20c3f5" (UID: "db31ec58-43e7-4625-9551-84405e20c3f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.638701 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db31ec58-43e7-4625-9551-84405e20c3f5-config-data" (OuterVolumeSpecName: "config-data") pod "db31ec58-43e7-4625-9551-84405e20c3f5" (UID: "db31ec58-43e7-4625-9551-84405e20c3f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.672399 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db31ec58-43e7-4625-9551-84405e20c3f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.672437 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db31ec58-43e7-4625-9551-84405e20c3f5-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.672446 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db31ec58-43e7-4625-9551-84405e20c3f5-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:55 crc kubenswrapper[4937]: I0225 16:14:55.672455 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fmt9\" (UniqueName: \"kubernetes.io/projected/db31ec58-43e7-4625-9551-84405e20c3f5-kube-api-access-4fmt9\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.079127 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rpnbs"] Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.456300 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db31ec58-43e7-4625-9551-84405e20c3f5","Type":"ContainerDied","Data":"53baa6ef8a7caa4c908549a7faca413bbce3c7739779f90969f6a27464698f98"} Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.456594 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.457651 4937 scope.go:117] "RemoveContainer" containerID="9a434b3980ce99761289913c0f390dea1b1b6729bebdc1de58d5004c3737f823" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.458370 4937 generic.go:334] "Generic (PLEG): container finished" podID="4f500ae5-7183-44f5-ba92-08705866baf2" containerID="94525afcd9ad485a9dec3066d6ce8188a8c1f567f10b9428d00d9fabcbd82c23" exitCode=0 Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.458514 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpnbs" event={"ID":"4f500ae5-7183-44f5-ba92-08705866baf2","Type":"ContainerDied","Data":"94525afcd9ad485a9dec3066d6ce8188a8c1f567f10b9428d00d9fabcbd82c23"} Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.458544 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpnbs" event={"ID":"4f500ae5-7183-44f5-ba92-08705866baf2","Type":"ContainerStarted","Data":"0b475e2c3f2a9fe1dba1d237298ca39f709d2a83619cd84fc528132f669d5f62"} Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.464338 4937 generic.go:334] "Generic (PLEG): container finished" podID="1670d95c-7850-442b-8b88-4b1d20464736" containerID="9f107e3aff32b2d3d5ee0344274edcfd799064835c2aaa615956b5230d30a2e9" exitCode=0 Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.465251 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1670d95c-7850-442b-8b88-4b1d20464736","Type":"ContainerDied","Data":"9f107e3aff32b2d3d5ee0344274edcfd799064835c2aaa615956b5230d30a2e9"} Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.490093 4937 scope.go:117] "RemoveContainer" containerID="2803d653b814fdf03abaca60765f621f6c161783bcd9ca50898875c9d5857eec" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.512355 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.528720 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.539443 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 25 16:14:56 crc kubenswrapper[4937]: E0225 16:14:56.544818 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db31ec58-43e7-4625-9551-84405e20c3f5" containerName="nova-api-log" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.544850 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="db31ec58-43e7-4625-9551-84405e20c3f5" containerName="nova-api-log" Feb 25 16:14:56 crc kubenswrapper[4937]: E0225 16:14:56.544884 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db31ec58-43e7-4625-9551-84405e20c3f5" containerName="nova-api-api" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.544891 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="db31ec58-43e7-4625-9551-84405e20c3f5" containerName="nova-api-api" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.545105 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="db31ec58-43e7-4625-9551-84405e20c3f5" containerName="nova-api-api" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.545123 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="db31ec58-43e7-4625-9551-84405e20c3f5" containerName="nova-api-log" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.546374 4937 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.560934 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.584327 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.619970 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.701177 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.701508 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-logs\") pod \"nova-api-0\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.701751 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6vrc\" (UniqueName: \"kubernetes.io/projected/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-kube-api-access-n6vrc\") pod \"nova-api-0\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.701944 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-config-data\") pod \"nova-api-0\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.803337 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1670d95c-7850-442b-8b88-4b1d20464736-combined-ca-bundle\") pod \"1670d95c-7850-442b-8b88-4b1d20464736\" (UID: \"1670d95c-7850-442b-8b88-4b1d20464736\") " Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.803593 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxlnd\" (UniqueName: \"kubernetes.io/projected/1670d95c-7850-442b-8b88-4b1d20464736-kube-api-access-gxlnd\") pod \"1670d95c-7850-442b-8b88-4b1d20464736\" (UID: \"1670d95c-7850-442b-8b88-4b1d20464736\") " Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.803911 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1670d95c-7850-442b-8b88-4b1d20464736-config-data\") pod \"1670d95c-7850-442b-8b88-4b1d20464736\" (UID: \"1670d95c-7850-442b-8b88-4b1d20464736\") " Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.804410 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-logs\") pod \"nova-api-0\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.804709 4937 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6vrc\" (UniqueName: \"kubernetes.io/projected/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-kube-api-access-n6vrc\") pod \"nova-api-0\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.804831 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-config-data\") pod \"nova-api-0\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.804953 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.804966 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-logs\") pod \"nova-api-0\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.809373 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.809433 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1670d95c-7850-442b-8b88-4b1d20464736-kube-api-access-gxlnd" (OuterVolumeSpecName: "kube-api-access-gxlnd") pod "1670d95c-7850-442b-8b88-4b1d20464736" (UID: "1670d95c-7850-442b-8b88-4b1d20464736"). InnerVolumeSpecName "kube-api-access-gxlnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.809517 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-config-data\") pod \"nova-api-0\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.821345 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6vrc\" (UniqueName: \"kubernetes.io/projected/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-kube-api-access-n6vrc\") pod \"nova-api-0\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " pod="openstack/nova-api-0" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.833380 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1670d95c-7850-442b-8b88-4b1d20464736-config-data" (OuterVolumeSpecName: "config-data") pod "1670d95c-7850-442b-8b88-4b1d20464736" (UID: "1670d95c-7850-442b-8b88-4b1d20464736"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.833889 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1670d95c-7850-442b-8b88-4b1d20464736-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1670d95c-7850-442b-8b88-4b1d20464736" (UID: "1670d95c-7850-442b-8b88-4b1d20464736"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.907341 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1670d95c-7850-442b-8b88-4b1d20464736-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.907368 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxlnd\" (UniqueName: \"kubernetes.io/projected/1670d95c-7850-442b-8b88-4b1d20464736-kube-api-access-gxlnd\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.907380 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1670d95c-7850-442b-8b88-4b1d20464736-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:14:56 crc kubenswrapper[4937]: I0225 16:14:56.935203 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.066847 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.067230 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.379168 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db31ec58-43e7-4625-9551-84405e20c3f5" path="/var/lib/kubelet/pods/db31ec58-43e7-4625-9551-84405e20c3f5/volumes" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.461130 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:14:57 crc kubenswrapper[4937]: W0225 16:14:57.464421 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dc3b1d2_cd5c_417b_b62f_e5fbc2473497.slice/crio-c81d36e436d352ee6de42e85b8668cb283e3765aaaa25664c07a2365f5283873 WatchSource:0}: Error finding container c81d36e436d352ee6de42e85b8668cb283e3765aaaa25664c07a2365f5283873: Status 404 returned error can't find the container with id c81d36e436d352ee6de42e85b8668cb283e3765aaaa25664c07a2365f5283873 Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.475518 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497","Type":"ContainerStarted","Data":"c81d36e436d352ee6de42e85b8668cb283e3765aaaa25664c07a2365f5283873"} Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.487889 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpnbs" event={"ID":"4f500ae5-7183-44f5-ba92-08705866baf2","Type":"ContainerStarted","Data":"f6247a24d8e0d89e30289f868db570590ecc6ea28beb1125e427ab2b7465f21a"} Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.494643 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"1670d95c-7850-442b-8b88-4b1d20464736","Type":"ContainerDied","Data":"955d55f43a2037fb0574b21c5a7c388b8880345a6114c1ef9bfb33d0e1fcc584"} Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.494681 4937 scope.go:117] "RemoveContainer" containerID="9f107e3aff32b2d3d5ee0344274edcfd799064835c2aaa615956b5230d30a2e9" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.494791 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.544527 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.561553 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.596145 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:14:57 crc kubenswrapper[4937]: E0225 16:14:57.596821 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1670d95c-7850-442b-8b88-4b1d20464736" containerName="nova-scheduler-scheduler" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.596845 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="1670d95c-7850-442b-8b88-4b1d20464736" containerName="nova-scheduler-scheduler" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.597129 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="1670d95c-7850-442b-8b88-4b1d20464736" containerName="nova-scheduler-scheduler" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.598231 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.605295 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.608141 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.730614 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8408967e-27f5-424c-9a30-be0b1b30812b\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.731133 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69gpn\" (UniqueName: \"kubernetes.io/projected/8408967e-27f5-424c-9a30-be0b1b30812b-kube-api-access-69gpn\") pod \"nova-scheduler-0\" (UID: \"8408967e-27f5-424c-9a30-be0b1b30812b\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.731231 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-config-data\") pod \"nova-scheduler-0\" (UID: \"8408967e-27f5-424c-9a30-be0b1b30812b\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.790903 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.834763 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8408967e-27f5-424c-9a30-be0b1b30812b\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.834914 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69gpn\" (UniqueName: \"kubernetes.io/projected/8408967e-27f5-424c-9a30-be0b1b30812b-kube-api-access-69gpn\") pod \"nova-scheduler-0\" (UID: \"8408967e-27f5-424c-9a30-be0b1b30812b\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.834982 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-config-data\") pod \"nova-scheduler-0\" (UID: \"8408967e-27f5-424c-9a30-be0b1b30812b\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.847672 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-config-data\") pod \"nova-scheduler-0\" (UID: \"8408967e-27f5-424c-9a30-be0b1b30812b\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.851138 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8408967e-27f5-424c-9a30-be0b1b30812b\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.867534 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69gpn\" (UniqueName: \"kubernetes.io/projected/8408967e-27f5-424c-9a30-be0b1b30812b-kube-api-access-69gpn\") pod \"nova-scheduler-0\" (UID: \"8408967e-27f5-424c-9a30-be0b1b30812b\") " pod="openstack/nova-scheduler-0" Feb 25 16:14:57 crc kubenswrapper[4937]: I0225 16:14:57.934728 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 16:14:58 crc kubenswrapper[4937]: I0225 16:14:58.434238 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:14:58 crc kubenswrapper[4937]: W0225 16:14:58.449727 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8408967e_27f5_424c_9a30_be0b1b30812b.slice/crio-d839301cbaab0a22faca8522bec57f66133ce7326e2a4a3ff9a6088fdc4cde3b WatchSource:0}: Error finding container d839301cbaab0a22faca8522bec57f66133ce7326e2a4a3ff9a6088fdc4cde3b: Status 404 returned error can't find the container with id d839301cbaab0a22faca8522bec57f66133ce7326e2a4a3ff9a6088fdc4cde3b Feb 25 16:14:58 crc kubenswrapper[4937]: I0225 16:14:58.506802 4937 generic.go:334] "Generic (PLEG): container finished" podID="4f500ae5-7183-44f5-ba92-08705866baf2" containerID="f6247a24d8e0d89e30289f868db570590ecc6ea28beb1125e427ab2b7465f21a" exitCode=0 Feb 25 16:14:58 crc kubenswrapper[4937]: I0225 16:14:58.506909 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpnbs" event={"ID":"4f500ae5-7183-44f5-ba92-08705866baf2","Type":"ContainerDied","Data":"f6247a24d8e0d89e30289f868db570590ecc6ea28beb1125e427ab2b7465f21a"} Feb 25 16:14:58 crc kubenswrapper[4937]: I0225 16:14:58.515290 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497","Type":"ContainerStarted","Data":"eebeb71401ee8dbae25a2c2de0f73b998b7d8939400b35d488f079c202ce1e76"} Feb 25 16:14:58 crc kubenswrapper[4937]: I0225 16:14:58.515341 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497","Type":"ContainerStarted","Data":"af41b91a5c20d45867b8bedc7d52fbc739737475d33b94bc2b7b07c71264348b"} Feb 25 16:14:58 crc kubenswrapper[4937]: I0225 16:14:58.517040 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8408967e-27f5-424c-9a30-be0b1b30812b","Type":"ContainerStarted","Data":"d839301cbaab0a22faca8522bec57f66133ce7326e2a4a3ff9a6088fdc4cde3b"} Feb 25 16:14:58 crc kubenswrapper[4937]: I0225 16:14:58.548171 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.548153858 podStartE2EDuration="2.548153858s" podCreationTimestamp="2026-02-25 16:14:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:14:58.546850175 +0000 UTC m=+1749.560242065" watchObservedRunningTime="2026-02-25 16:14:58.548153858 +0000 UTC m=+1749.561545748" Feb 25 16:14:59 crc kubenswrapper[4937]: I0225 16:14:59.384153 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1670d95c-7850-442b-8b88-4b1d20464736" path="/var/lib/kubelet/pods/1670d95c-7850-442b-8b88-4b1d20464736/volumes" Feb 25 16:14:59 crc kubenswrapper[4937]: I0225 16:14:59.528853 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8408967e-27f5-424c-9a30-be0b1b30812b","Type":"ContainerStarted","Data":"0877eab7cd85879652c63807278dfdbd811256eab8030a4026359b084e89a79e"} Feb 25 16:14:59 crc kubenswrapper[4937]: I0225 16:14:59.531196 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpnbs" 
event={"ID":"4f500ae5-7183-44f5-ba92-08705866baf2","Type":"ContainerStarted","Data":"ad6b23cf8e3088029b0b7408619854bf0f7fcf480318c179ab82ba27f2a4ac89"} Feb 25 16:14:59 crc kubenswrapper[4937]: I0225 16:14:59.553156 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.5531399219999997 podStartE2EDuration="2.553139922s" podCreationTimestamp="2026-02-25 16:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:14:59.545261474 +0000 UTC m=+1750.558653364" watchObservedRunningTime="2026-02-25 16:14:59.553139922 +0000 UTC m=+1750.566531812" Feb 25 16:14:59 crc kubenswrapper[4937]: I0225 16:14:59.572163 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rpnbs" podStartSLOduration=2.0925084959999998 podStartE2EDuration="4.572143038s" podCreationTimestamp="2026-02-25 16:14:55 +0000 UTC" firstStartedPulling="2026-02-25 16:14:56.461675352 +0000 UTC m=+1747.475067242" lastFinishedPulling="2026-02-25 16:14:58.941309904 +0000 UTC m=+1749.954701784" observedRunningTime="2026-02-25 16:14:59.564095286 +0000 UTC m=+1750.577487176" watchObservedRunningTime="2026-02-25 16:14:59.572143038 +0000 UTC m=+1750.585534928" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.142817 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf"] Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.144639 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.149777 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.149927 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.160286 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf"] Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.294534 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7fd2813-1349-41f2-bcee-c2a6650cfafc-secret-volume\") pod \"collect-profiles-29533935-jmtbf\" (UID: \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.294854 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7fd2813-1349-41f2-bcee-c2a6650cfafc-config-volume\") pod \"collect-profiles-29533935-jmtbf\" (UID: \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.295145 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrw5z\" (UniqueName: \"kubernetes.io/projected/d7fd2813-1349-41f2-bcee-c2a6650cfafc-kube-api-access-nrw5z\") pod 
\"collect-profiles-29533935-jmtbf\" (UID: \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.397303 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7fd2813-1349-41f2-bcee-c2a6650cfafc-config-volume\") pod \"collect-profiles-29533935-jmtbf\" (UID: \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.397428 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrw5z\" (UniqueName: \"kubernetes.io/projected/d7fd2813-1349-41f2-bcee-c2a6650cfafc-kube-api-access-nrw5z\") pod \"collect-profiles-29533935-jmtbf\" (UID: \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.397538 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7fd2813-1349-41f2-bcee-c2a6650cfafc-secret-volume\") pod \"collect-profiles-29533935-jmtbf\" (UID: \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.398585 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7fd2813-1349-41f2-bcee-c2a6650cfafc-config-volume\") pod \"collect-profiles-29533935-jmtbf\" (UID: \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.415523 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7fd2813-1349-41f2-bcee-c2a6650cfafc-secret-volume\") pod \"collect-profiles-29533935-jmtbf\" (UID: \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.419632 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrw5z\" (UniqueName: \"kubernetes.io/projected/d7fd2813-1349-41f2-bcee-c2a6650cfafc-kube-api-access-nrw5z\") pod \"collect-profiles-29533935-jmtbf\" (UID: \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.460395 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" Feb 25 16:15:00 crc kubenswrapper[4937]: I0225 16:15:00.975935 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf"] Feb 25 16:15:00 crc kubenswrapper[4937]: W0225 16:15:00.978095 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7fd2813_1349_41f2_bcee_c2a6650cfafc.slice/crio-025c9d1cebc4abde4cbcb29133fb248c26bc035dda12b140cc1b76850383d1fc WatchSource:0}: Error finding container 025c9d1cebc4abde4cbcb29133fb248c26bc035dda12b140cc1b76850383d1fc: Status 404 returned error can't find the container with id 025c9d1cebc4abde4cbcb29133fb248c26bc035dda12b140cc1b76850383d1fc Feb 25 16:15:01 crc kubenswrapper[4937]: I0225 16:15:01.382728 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:15:01 crc kubenswrapper[4937]: E0225 16:15:01.383062 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:15:01 crc kubenswrapper[4937]: I0225 16:15:01.559415 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" event={"ID":"d7fd2813-1349-41f2-bcee-c2a6650cfafc","Type":"ContainerStarted","Data":"96156e3c2ea3d1cca434e94132995ab19cb05f3e731740baafbdaed496f5ca16"} Feb 25 16:15:01 crc kubenswrapper[4937]: I0225 16:15:01.559797 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" event={"ID":"d7fd2813-1349-41f2-bcee-c2a6650cfafc","Type":"ContainerStarted","Data":"025c9d1cebc4abde4cbcb29133fb248c26bc035dda12b140cc1b76850383d1fc"} Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.018550 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" podStartSLOduration=2.018534266 podStartE2EDuration="2.018534266s" podCreationTimestamp="2026-02-25 16:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:15:01.580902856 +0000 UTC m=+1752.594294746" watchObservedRunningTime="2026-02-25 16:15:02.018534266 +0000 UTC m=+1753.031926146" Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.025028 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.025232 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d34734ba-2195-4ea2-aa76-654c3c85a206" containerName="kube-state-metrics" containerID="cri-o://6ac683d3e2d86111c3d04b83d23a991feccf0736c9fcc99f088f3ac5b5fabf4a" gracePeriod=30 Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.066853 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.066931 4937 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.583708 4937 generic.go:334] "Generic (PLEG): container finished" podID="d7fd2813-1349-41f2-bcee-c2a6650cfafc" containerID="96156e3c2ea3d1cca434e94132995ab19cb05f3e731740baafbdaed496f5ca16" exitCode=0 Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.583998 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" event={"ID":"d7fd2813-1349-41f2-bcee-c2a6650cfafc","Type":"ContainerDied","Data":"96156e3c2ea3d1cca434e94132995ab19cb05f3e731740baafbdaed496f5ca16"} Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.593194 4937 generic.go:334] "Generic (PLEG): container finished" podID="d34734ba-2195-4ea2-aa76-654c3c85a206" containerID="6ac683d3e2d86111c3d04b83d23a991feccf0736c9fcc99f088f3ac5b5fabf4a" exitCode=2 Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.593246 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d34734ba-2195-4ea2-aa76-654c3c85a206","Type":"ContainerDied","Data":"6ac683d3e2d86111c3d04b83d23a991feccf0736c9fcc99f088f3ac5b5fabf4a"} Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.731938 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.822463 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.847865 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw7gh\" (UniqueName: \"kubernetes.io/projected/d34734ba-2195-4ea2-aa76-654c3c85a206-kube-api-access-dw7gh\") pod \"d34734ba-2195-4ea2-aa76-654c3c85a206\" (UID: \"d34734ba-2195-4ea2-aa76-654c3c85a206\") " Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.855090 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34734ba-2195-4ea2-aa76-654c3c85a206-kube-api-access-dw7gh" (OuterVolumeSpecName: "kube-api-access-dw7gh") pod "d34734ba-2195-4ea2-aa76-654c3c85a206" (UID: "d34734ba-2195-4ea2-aa76-654c3c85a206"). InnerVolumeSpecName "kube-api-access-dw7gh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.935946 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 25 16:15:02 crc kubenswrapper[4937]: I0225 16:15:02.951094 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw7gh\" (UniqueName: \"kubernetes.io/projected/d34734ba-2195-4ea2-aa76-654c3c85a206-kube-api-access-dw7gh\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.080696 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.081008 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.331821 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.390304 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.606977 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d34734ba-2195-4ea2-aa76-654c3c85a206","Type":"ContainerDied","Data":"13a26941777b61858e36029569e328b328b8267618d5b0cee4c672432c11c760"} Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.607067 4937 scope.go:117] "RemoveContainer" containerID="6ac683d3e2d86111c3d04b83d23a991feccf0736c9fcc99f088f3ac5b5fabf4a" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.607134 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.620052 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2vtpf"] Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.655957 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.671567 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.686556 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 16:15:03 crc kubenswrapper[4937]: E0225 16:15:03.687195 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34734ba-2195-4ea2-aa76-654c3c85a206" containerName="kube-state-metrics" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.687222 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34734ba-2195-4ea2-aa76-654c3c85a206" containerName="kube-state-metrics" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.687515 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34734ba-2195-4ea2-aa76-654c3c85a206" containerName="kube-state-metrics" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.689899 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.692627 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.693836 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.696873 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.880257 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcmn9\" (UniqueName: \"kubernetes.io/projected/b0312225-730b-46a3-8142-6a39e9d69f60-kube-api-access-vcmn9\") pod \"kube-state-metrics-0\" (UID: \"b0312225-730b-46a3-8142-6a39e9d69f60\") " pod="openstack/kube-state-metrics-0" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.880389 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0312225-730b-46a3-8142-6a39e9d69f60-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b0312225-730b-46a3-8142-6a39e9d69f60\") " pod="openstack/kube-state-metrics-0" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.880434 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0312225-730b-46a3-8142-6a39e9d69f60-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b0312225-730b-46a3-8142-6a39e9d69f60\") " pod="openstack/kube-state-metrics-0" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.880531 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b0312225-730b-46a3-8142-6a39e9d69f60-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: 
\"b0312225-730b-46a3-8142-6a39e9d69f60\") " pod="openstack/kube-state-metrics-0" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.985448 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0312225-730b-46a3-8142-6a39e9d69f60-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b0312225-730b-46a3-8142-6a39e9d69f60\") " pod="openstack/kube-state-metrics-0" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.985694 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0312225-730b-46a3-8142-6a39e9d69f60-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b0312225-730b-46a3-8142-6a39e9d69f60\") " pod="openstack/kube-state-metrics-0" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.985939 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b0312225-730b-46a3-8142-6a39e9d69f60-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b0312225-730b-46a3-8142-6a39e9d69f60\") " pod="openstack/kube-state-metrics-0" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.986198 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcmn9\" (UniqueName: \"kubernetes.io/projected/b0312225-730b-46a3-8142-6a39e9d69f60-kube-api-access-vcmn9\") pod \"kube-state-metrics-0\" (UID: \"b0312225-730b-46a3-8142-6a39e9d69f60\") " pod="openstack/kube-state-metrics-0" Feb 25 16:15:03 crc kubenswrapper[4937]: I0225 16:15:03.992411 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b0312225-730b-46a3-8142-6a39e9d69f60-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b0312225-730b-46a3-8142-6a39e9d69f60\") " pod="openstack/kube-state-metrics-0" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.009239 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0312225-730b-46a3-8142-6a39e9d69f60-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b0312225-730b-46a3-8142-6a39e9d69f60\") " pod="openstack/kube-state-metrics-0" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.009825 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcmn9\" (UniqueName: \"kubernetes.io/projected/b0312225-730b-46a3-8142-6a39e9d69f60-kube-api-access-vcmn9\") pod \"kube-state-metrics-0\" (UID: \"b0312225-730b-46a3-8142-6a39e9d69f60\") " pod="openstack/kube-state-metrics-0" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.030621 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0312225-730b-46a3-8142-6a39e9d69f60-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b0312225-730b-46a3-8142-6a39e9d69f60\") " pod="openstack/kube-state-metrics-0" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.211253 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.292181 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrw5z\" (UniqueName: \"kubernetes.io/projected/d7fd2813-1349-41f2-bcee-c2a6650cfafc-kube-api-access-nrw5z\") pod \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\" (UID: \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\") " Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.292628 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7fd2813-1349-41f2-bcee-c2a6650cfafc-config-volume\") pod \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\" (UID: \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\") " Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.292743 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7fd2813-1349-41f2-bcee-c2a6650cfafc-secret-volume\") pod \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\" (UID: \"d7fd2813-1349-41f2-bcee-c2a6650cfafc\") " Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.293352 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7fd2813-1349-41f2-bcee-c2a6650cfafc-config-volume" (OuterVolumeSpecName: "config-volume") pod "d7fd2813-1349-41f2-bcee-c2a6650cfafc" (UID: "d7fd2813-1349-41f2-bcee-c2a6650cfafc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.300745 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7fd2813-1349-41f2-bcee-c2a6650cfafc-kube-api-access-nrw5z" (OuterVolumeSpecName: "kube-api-access-nrw5z") pod "d7fd2813-1349-41f2-bcee-c2a6650cfafc" (UID: "d7fd2813-1349-41f2-bcee-c2a6650cfafc"). InnerVolumeSpecName "kube-api-access-nrw5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.305086 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7fd2813-1349-41f2-bcee-c2a6650cfafc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d7fd2813-1349-41f2-bcee-c2a6650cfafc" (UID: "d7fd2813-1349-41f2-bcee-c2a6650cfafc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.320658 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.395935 4937 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7fd2813-1349-41f2-bcee-c2a6650cfafc-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.395978 4937 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7fd2813-1349-41f2-bcee-c2a6650cfafc-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.395991 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrw5z\" (UniqueName: \"kubernetes.io/projected/d7fd2813-1349-41f2-bcee-c2a6650cfafc-kube-api-access-nrw5z\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.551032 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.551407 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="ceilometer-central-agent" containerID="cri-o://7380a46d4a13eebb08f667fc944c523adef599821a7a2a344f2c535ed31ed736" gracePeriod=30 Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.551580 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="proxy-httpd" containerID="cri-o://e3d89d9183e76e7e19ae1f8aadba0108919676add1b655fc803e234fe6f91386" gracePeriod=30 Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.551642 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="sg-core" containerID="cri-o://3b8030e81823bf310684ff855a375a6ba05dbac1ba9f0a61fe66a8ce26b61481" gracePeriod=30 Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.551696 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="ceilometer-notification-agent" containerID="cri-o://da49f7b78ff9b1b0557982a4df2ebde43a9731336dca7991680119793c917b1d" gracePeriod=30 Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.634144 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" event={"ID":"d7fd2813-1349-41f2-bcee-c2a6650cfafc","Type":"ContainerDied","Data":"025c9d1cebc4abde4cbcb29133fb248c26bc035dda12b140cc1b76850383d1fc"} Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.634192 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="025c9d1cebc4abde4cbcb29133fb248c26bc035dda12b140cc1b76850383d1fc" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.634274 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf" Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.640036 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2vtpf" podUID="2462d0bc-6986-4337-a05a-863a45a45393" containerName="registry-server" containerID="cri-o://e76eb058b9376dc15a57d0a8f480801464114829851c166f290a00336b05edcf" gracePeriod=2 Feb 25 16:15:04 crc kubenswrapper[4937]: I0225 16:15:04.924612 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 16:15:04 crc kubenswrapper[4937]: W0225 16:15:04.944047 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0312225_730b_46a3_8142_6a39e9d69f60.slice/crio-162bb59f90a0a3b351f0b7c79f9f8085180c0b07229fde6a4d5a030b0303b7b4 WatchSource:0}: Error finding container 162bb59f90a0a3b351f0b7c79f9f8085180c0b07229fde6a4d5a030b0303b7b4: Status 404 returned error can't find the container with id 162bb59f90a0a3b351f0b7c79f9f8085180c0b07229fde6a4d5a030b0303b7b4 Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.265774 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.380179 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34734ba-2195-4ea2-aa76-654c3c85a206" path="/var/lib/kubelet/pods/d34734ba-2195-4ea2-aa76-654c3c85a206/volumes" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.423247 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2462d0bc-6986-4337-a05a-863a45a45393-catalog-content\") pod \"2462d0bc-6986-4337-a05a-863a45a45393\" (UID: \"2462d0bc-6986-4337-a05a-863a45a45393\") " Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.423319 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlk9g\" (UniqueName: \"kubernetes.io/projected/2462d0bc-6986-4337-a05a-863a45a45393-kube-api-access-dlk9g\") pod \"2462d0bc-6986-4337-a05a-863a45a45393\" (UID: \"2462d0bc-6986-4337-a05a-863a45a45393\") " Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.423455 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2462d0bc-6986-4337-a05a-863a45a45393-utilities\") pod \"2462d0bc-6986-4337-a05a-863a45a45393\" (UID: \"2462d0bc-6986-4337-a05a-863a45a45393\") " Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.424060 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2462d0bc-6986-4337-a05a-863a45a45393-utilities" (OuterVolumeSpecName: "utilities") pod "2462d0bc-6986-4337-a05a-863a45a45393" (UID: "2462d0bc-6986-4337-a05a-863a45a45393"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.427223 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2462d0bc-6986-4337-a05a-863a45a45393-kube-api-access-dlk9g" (OuterVolumeSpecName: "kube-api-access-dlk9g") pod "2462d0bc-6986-4337-a05a-863a45a45393" (UID: "2462d0bc-6986-4337-a05a-863a45a45393"). InnerVolumeSpecName "kube-api-access-dlk9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.472254 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2462d0bc-6986-4337-a05a-863a45a45393-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2462d0bc-6986-4337-a05a-863a45a45393" (UID: "2462d0bc-6986-4337-a05a-863a45a45393"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.532052 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2462d0bc-6986-4337-a05a-863a45a45393-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.532087 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlk9g\" (UniqueName: \"kubernetes.io/projected/2462d0bc-6986-4337-a05a-863a45a45393-kube-api-access-dlk9g\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.532100 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2462d0bc-6986-4337-a05a-863a45a45393-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.568515 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.569024 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.631520 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.650816 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b0312225-730b-46a3-8142-6a39e9d69f60","Type":"ContainerStarted","Data":"b6207012e3608c6cc930676862361579b25fa18ade8a55e494a0fe79f74bf5cf"} Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.651095 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b0312225-730b-46a3-8142-6a39e9d69f60","Type":"ContainerStarted","Data":"162bb59f90a0a3b351f0b7c79f9f8085180c0b07229fde6a4d5a030b0303b7b4"} Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.651201 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.655528 4937 generic.go:334] "Generic (PLEG): container finished" podID="2462d0bc-6986-4337-a05a-863a45a45393" containerID="e76eb058b9376dc15a57d0a8f480801464114829851c166f290a00336b05edcf" exitCode=0 Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.655698 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2vtpf" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.655751 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2vtpf" event={"ID":"2462d0bc-6986-4337-a05a-863a45a45393","Type":"ContainerDied","Data":"e76eb058b9376dc15a57d0a8f480801464114829851c166f290a00336b05edcf"} Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.655872 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2vtpf" event={"ID":"2462d0bc-6986-4337-a05a-863a45a45393","Type":"ContainerDied","Data":"850f264ffaa1d6cbdabe55ca322e0926480f945f8acea36c9b2b7a648a12b164"} Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.655927 4937 scope.go:117] "RemoveContainer" containerID="e76eb058b9376dc15a57d0a8f480801464114829851c166f290a00336b05edcf" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.662321 4937 generic.go:334] "Generic (PLEG): container finished" podID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerID="e3d89d9183e76e7e19ae1f8aadba0108919676add1b655fc803e234fe6f91386" exitCode=0 Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.662729 4937 generic.go:334] "Generic (PLEG): container finished" podID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerID="3b8030e81823bf310684ff855a375a6ba05dbac1ba9f0a61fe66a8ce26b61481" exitCode=2 Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.662813 4937 generic.go:334] "Generic (PLEG): container finished" podID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerID="7380a46d4a13eebb08f667fc944c523adef599821a7a2a344f2c535ed31ed736" exitCode=0 Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.664105 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7030cdee-8f14-4cd8-959a-f941ac0414e9","Type":"ContainerDied","Data":"e3d89d9183e76e7e19ae1f8aadba0108919676add1b655fc803e234fe6f91386"} Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.664232 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7030cdee-8f14-4cd8-959a-f941ac0414e9","Type":"ContainerDied","Data":"3b8030e81823bf310684ff855a375a6ba05dbac1ba9f0a61fe66a8ce26b61481"} Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.664321 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7030cdee-8f14-4cd8-959a-f941ac0414e9","Type":"ContainerDied","Data":"7380a46d4a13eebb08f667fc944c523adef599821a7a2a344f2c535ed31ed736"} Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.682645 4937 scope.go:117] "RemoveContainer" containerID="f8c96562bdcba732dfa1548fc9056841f26f245f11b1d0e5fac61d7755725289" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.690148 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.303742313 podStartE2EDuration="2.690123069s" podCreationTimestamp="2026-02-25 16:15:03 +0000 UTC" firstStartedPulling="2026-02-25 16:15:04.948611711 +0000 UTC m=+1755.962003601" lastFinishedPulling="2026-02-25 16:15:05.334992467 +0000 UTC m=+1756.348384357" observedRunningTime="2026-02-25 16:15:05.672536099 +0000 UTC m=+1756.685927999" watchObservedRunningTime="2026-02-25 16:15:05.690123069 +0000 UTC m=+1756.703514969" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.716348 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2vtpf"] Feb 25 16:15:05 crc 
kubenswrapper[4937]: I0225 16:15:05.718459 4937 scope.go:117] "RemoveContainer" containerID="61568512f7ea525a4b8390c901299ac1af75e738759daeb330cf920b82a9980e" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.727769 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2vtpf"] Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.737505 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.746851 4937 scope.go:117] "RemoveContainer" containerID="e76eb058b9376dc15a57d0a8f480801464114829851c166f290a00336b05edcf" Feb 25 16:15:05 crc kubenswrapper[4937]: E0225 16:15:05.747399 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e76eb058b9376dc15a57d0a8f480801464114829851c166f290a00336b05edcf\": container with ID starting with e76eb058b9376dc15a57d0a8f480801464114829851c166f290a00336b05edcf not found: ID does not exist" containerID="e76eb058b9376dc15a57d0a8f480801464114829851c166f290a00336b05edcf" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.747440 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e76eb058b9376dc15a57d0a8f480801464114829851c166f290a00336b05edcf"} err="failed to get container status \"e76eb058b9376dc15a57d0a8f480801464114829851c166f290a00336b05edcf\": rpc error: code = NotFound desc = could not find container \"e76eb058b9376dc15a57d0a8f480801464114829851c166f290a00336b05edcf\": container with ID starting with e76eb058b9376dc15a57d0a8f480801464114829851c166f290a00336b05edcf not found: ID does not exist" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.747472 4937 scope.go:117] "RemoveContainer" containerID="f8c96562bdcba732dfa1548fc9056841f26f245f11b1d0e5fac61d7755725289" Feb 25 16:15:05 crc kubenswrapper[4937]: E0225 16:15:05.747988 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8c96562bdcba732dfa1548fc9056841f26f245f11b1d0e5fac61d7755725289\": container with ID starting with f8c96562bdcba732dfa1548fc9056841f26f245f11b1d0e5fac61d7755725289 not found: ID does not exist" containerID="f8c96562bdcba732dfa1548fc9056841f26f245f11b1d0e5fac61d7755725289" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.748011 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8c96562bdcba732dfa1548fc9056841f26f245f11b1d0e5fac61d7755725289"} err="failed to get container status \"f8c96562bdcba732dfa1548fc9056841f26f245f11b1d0e5fac61d7755725289\": rpc error: code = NotFound desc = could not find container \"f8c96562bdcba732dfa1548fc9056841f26f245f11b1d0e5fac61d7755725289\": container with ID starting with f8c96562bdcba732dfa1548fc9056841f26f245f11b1d0e5fac61d7755725289 not found: ID does not exist" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.748029 4937 scope.go:117] "RemoveContainer" containerID="61568512f7ea525a4b8390c901299ac1af75e738759daeb330cf920b82a9980e" Feb 25 16:15:05 crc kubenswrapper[4937]: E0225 16:15:05.748257 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61568512f7ea525a4b8390c901299ac1af75e738759daeb330cf920b82a9980e\": container with ID starting with 61568512f7ea525a4b8390c901299ac1af75e738759daeb330cf920b82a9980e not found: ID does not 
exist" containerID="61568512f7ea525a4b8390c901299ac1af75e738759daeb330cf920b82a9980e" Feb 25 16:15:05 crc kubenswrapper[4937]: I0225 16:15:05.748278 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61568512f7ea525a4b8390c901299ac1af75e738759daeb330cf920b82a9980e"} err="failed to get container status \"61568512f7ea525a4b8390c901299ac1af75e738759daeb330cf920b82a9980e\": rpc error: code = NotFound desc = could not find container \"61568512f7ea525a4b8390c901299ac1af75e738759daeb330cf920b82a9980e\": container with ID starting with 61568512f7ea525a4b8390c901299ac1af75e738759daeb330cf920b82a9980e not found: ID does not exist" Feb 25 16:15:06 crc kubenswrapper[4937]: I0225 16:15:06.936772 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 16:15:06 crc kubenswrapper[4937]: I0225 16:15:06.937155 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 16:15:07 crc kubenswrapper[4937]: I0225 16:15:07.379344 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2462d0bc-6986-4337-a05a-863a45a45393" path="/var/lib/kubelet/pods/2462d0bc-6986-4337-a05a-863a45a45393/volumes" Feb 25 16:15:07 crc kubenswrapper[4937]: I0225 16:15:07.935432 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 25 16:15:07 crc kubenswrapper[4937]: I0225 16:15:07.970087 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 25 16:15:08 crc kubenswrapper[4937]: I0225 16:15:08.012820 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rpnbs"] Feb 25 16:15:08 crc kubenswrapper[4937]: I0225 16:15:08.020707 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.232:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 16:15:08 crc kubenswrapper[4937]: I0225 16:15:08.020762 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.232:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 16:15:08 crc kubenswrapper[4937]: I0225 16:15:08.693365 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rpnbs" podUID="4f500ae5-7183-44f5-ba92-08705866baf2" containerName="registry-server" containerID="cri-o://ad6b23cf8e3088029b0b7408619854bf0f7fcf480318c179ab82ba27f2a4ac89" gracePeriod=2 Feb 25 16:15:08 crc kubenswrapper[4937]: I0225 16:15:08.731096 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.533101 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.539091 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.637657 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6bk5\" (UniqueName: \"kubernetes.io/projected/4f500ae5-7183-44f5-ba92-08705866baf2-kube-api-access-j6bk5\") pod \"4f500ae5-7183-44f5-ba92-08705866baf2\" (UID: \"4f500ae5-7183-44f5-ba92-08705866baf2\") " Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.637737 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-scripts\") pod \"7030cdee-8f14-4cd8-959a-f941ac0414e9\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.637819 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pgvj\" (UniqueName: \"kubernetes.io/projected/7030cdee-8f14-4cd8-959a-f941ac0414e9-kube-api-access-7pgvj\") pod \"7030cdee-8f14-4cd8-959a-f941ac0414e9\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.638717 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-config-data\") pod \"7030cdee-8f14-4cd8-959a-f941ac0414e9\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.638804 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7030cdee-8f14-4cd8-959a-f941ac0414e9-run-httpd\") pod \"7030cdee-8f14-4cd8-959a-f941ac0414e9\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.638912 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-sg-core-conf-yaml\") pod \"7030cdee-8f14-4cd8-959a-f941ac0414e9\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.638936 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f500ae5-7183-44f5-ba92-08705866baf2-catalog-content\") pod \"4f500ae5-7183-44f5-ba92-08705866baf2\" (UID: \"4f500ae5-7183-44f5-ba92-08705866baf2\") " Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.639010 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7030cdee-8f14-4cd8-959a-f941ac0414e9-log-httpd\") pod \"7030cdee-8f14-4cd8-959a-f941ac0414e9\" (UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.639034 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f500ae5-7183-44f5-ba92-08705866baf2-utilities\") pod \"4f500ae5-7183-44f5-ba92-08705866baf2\" (UID: \"4f500ae5-7183-44f5-ba92-08705866baf2\") " Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.639072 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-combined-ca-bundle\") pod \"7030cdee-8f14-4cd8-959a-f941ac0414e9\" 
(UID: \"7030cdee-8f14-4cd8-959a-f941ac0414e9\") " Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.639899 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7030cdee-8f14-4cd8-959a-f941ac0414e9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7030cdee-8f14-4cd8-959a-f941ac0414e9" (UID: "7030cdee-8f14-4cd8-959a-f941ac0414e9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.640389 4937 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7030cdee-8f14-4cd8-959a-f941ac0414e9-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.641156 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f500ae5-7183-44f5-ba92-08705866baf2-utilities" (OuterVolumeSpecName: "utilities") pod "4f500ae5-7183-44f5-ba92-08705866baf2" (UID: "4f500ae5-7183-44f5-ba92-08705866baf2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.641333 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7030cdee-8f14-4cd8-959a-f941ac0414e9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7030cdee-8f14-4cd8-959a-f941ac0414e9" (UID: "7030cdee-8f14-4cd8-959a-f941ac0414e9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.643899 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-scripts" (OuterVolumeSpecName: "scripts") pod "7030cdee-8f14-4cd8-959a-f941ac0414e9" (UID: "7030cdee-8f14-4cd8-959a-f941ac0414e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.646323 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7030cdee-8f14-4cd8-959a-f941ac0414e9-kube-api-access-7pgvj" (OuterVolumeSpecName: "kube-api-access-7pgvj") pod "7030cdee-8f14-4cd8-959a-f941ac0414e9" (UID: "7030cdee-8f14-4cd8-959a-f941ac0414e9"). InnerVolumeSpecName "kube-api-access-7pgvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.648691 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f500ae5-7183-44f5-ba92-08705866baf2-kube-api-access-j6bk5" (OuterVolumeSpecName: "kube-api-access-j6bk5") pod "4f500ae5-7183-44f5-ba92-08705866baf2" (UID: "4f500ae5-7183-44f5-ba92-08705866baf2"). InnerVolumeSpecName "kube-api-access-j6bk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.682879 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f500ae5-7183-44f5-ba92-08705866baf2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f500ae5-7183-44f5-ba92-08705866baf2" (UID: "4f500ae5-7183-44f5-ba92-08705866baf2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.682976 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7030cdee-8f14-4cd8-959a-f941ac0414e9" (UID: "7030cdee-8f14-4cd8-959a-f941ac0414e9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.704801 4937 generic.go:334] "Generic (PLEG): container finished" podID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerID="da49f7b78ff9b1b0557982a4df2ebde43a9731336dca7991680119793c917b1d" exitCode=0 Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.704869 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7030cdee-8f14-4cd8-959a-f941ac0414e9","Type":"ContainerDied","Data":"da49f7b78ff9b1b0557982a4df2ebde43a9731336dca7991680119793c917b1d"} Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.704904 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7030cdee-8f14-4cd8-959a-f941ac0414e9","Type":"ContainerDied","Data":"48448a2d7fdc06cb5701ddae94292da76ca94b3c5afdaae229c71463a10538ad"} Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.704934 4937 scope.go:117] "RemoveContainer" containerID="e3d89d9183e76e7e19ae1f8aadba0108919676add1b655fc803e234fe6f91386" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.705097 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.709384 4937 generic.go:334] "Generic (PLEG): container finished" podID="4f500ae5-7183-44f5-ba92-08705866baf2" containerID="ad6b23cf8e3088029b0b7408619854bf0f7fcf480318c179ab82ba27f2a4ac89" exitCode=0 Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.709554 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpnbs" event={"ID":"4f500ae5-7183-44f5-ba92-08705866baf2","Type":"ContainerDied","Data":"ad6b23cf8e3088029b0b7408619854bf0f7fcf480318c179ab82ba27f2a4ac89"} Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.709602 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpnbs" event={"ID":"4f500ae5-7183-44f5-ba92-08705866baf2","Type":"ContainerDied","Data":"0b475e2c3f2a9fe1dba1d237298ca39f709d2a83619cd84fc528132f669d5f62"} Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.709665 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rpnbs" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.727449 4937 scope.go:117] "RemoveContainer" containerID="3b8030e81823bf310684ff855a375a6ba05dbac1ba9f0a61fe66a8ce26b61481" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.732206 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7030cdee-8f14-4cd8-959a-f941ac0414e9" (UID: "7030cdee-8f14-4cd8-959a-f941ac0414e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.746292 4937 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7030cdee-8f14-4cd8-959a-f941ac0414e9-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.746326 4937 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.746335 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f500ae5-7183-44f5-ba92-08705866baf2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.746344 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f500ae5-7183-44f5-ba92-08705866baf2-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.746352 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.746361 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6bk5\" (UniqueName: \"kubernetes.io/projected/4f500ae5-7183-44f5-ba92-08705866baf2-kube-api-access-j6bk5\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.746370 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.746378 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pgvj\" (UniqueName: \"kubernetes.io/projected/7030cdee-8f14-4cd8-959a-f941ac0414e9-kube-api-access-7pgvj\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.746553 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rpnbs"] Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.749708 4937 scope.go:117] "RemoveContainer" containerID="da49f7b78ff9b1b0557982a4df2ebde43a9731336dca7991680119793c917b1d" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.756642 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rpnbs"] Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.767771 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-config-data" (OuterVolumeSpecName: "config-data") pod "7030cdee-8f14-4cd8-959a-f941ac0414e9" (UID: "7030cdee-8f14-4cd8-959a-f941ac0414e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.773930 4937 scope.go:117] "RemoveContainer" containerID="7380a46d4a13eebb08f667fc944c523adef599821a7a2a344f2c535ed31ed736" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.842974 4937 scope.go:117] "RemoveContainer" containerID="e3d89d9183e76e7e19ae1f8aadba0108919676add1b655fc803e234fe6f91386" Feb 25 16:15:09 crc kubenswrapper[4937]: E0225 16:15:09.844346 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3d89d9183e76e7e19ae1f8aadba0108919676add1b655fc803e234fe6f91386\": container with ID starting with e3d89d9183e76e7e19ae1f8aadba0108919676add1b655fc803e234fe6f91386 not found: ID does not exist" containerID="e3d89d9183e76e7e19ae1f8aadba0108919676add1b655fc803e234fe6f91386" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.844387 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3d89d9183e76e7e19ae1f8aadba0108919676add1b655fc803e234fe6f91386"} err="failed to get container status \"e3d89d9183e76e7e19ae1f8aadba0108919676add1b655fc803e234fe6f91386\": rpc error: code = NotFound desc = could not find container \"e3d89d9183e76e7e19ae1f8aadba0108919676add1b655fc803e234fe6f91386\": container with ID starting with e3d89d9183e76e7e19ae1f8aadba0108919676add1b655fc803e234fe6f91386 not found: ID does not exist" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.844889 4937 scope.go:117] "RemoveContainer" containerID="3b8030e81823bf310684ff855a375a6ba05dbac1ba9f0a61fe66a8ce26b61481" Feb 25 16:15:09 crc kubenswrapper[4937]: E0225 16:15:09.845320 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b8030e81823bf310684ff855a375a6ba05dbac1ba9f0a61fe66a8ce26b61481\": container with ID starting with 3b8030e81823bf310684ff855a375a6ba05dbac1ba9f0a61fe66a8ce26b61481 not found: ID does not exist" containerID="3b8030e81823bf310684ff855a375a6ba05dbac1ba9f0a61fe66a8ce26b61481" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.845367 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b8030e81823bf310684ff855a375a6ba05dbac1ba9f0a61fe66a8ce26b61481"} err="failed to get container status \"3b8030e81823bf310684ff855a375a6ba05dbac1ba9f0a61fe66a8ce26b61481\": rpc error: code = NotFound desc = could not find container \"3b8030e81823bf310684ff855a375a6ba05dbac1ba9f0a61fe66a8ce26b61481\": container with ID starting with 3b8030e81823bf310684ff855a375a6ba05dbac1ba9f0a61fe66a8ce26b61481 not found: ID does not exist" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.845402 4937 scope.go:117] "RemoveContainer" containerID="da49f7b78ff9b1b0557982a4df2ebde43a9731336dca7991680119793c917b1d" Feb 25 16:15:09 crc kubenswrapper[4937]: E0225 16:15:09.845726 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da49f7b78ff9b1b0557982a4df2ebde43a9731336dca7991680119793c917b1d\": container with ID starting with da49f7b78ff9b1b0557982a4df2ebde43a9731336dca7991680119793c917b1d not found: ID does not exist" containerID="da49f7b78ff9b1b0557982a4df2ebde43a9731336dca7991680119793c917b1d" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.845806 4937 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"da49f7b78ff9b1b0557982a4df2ebde43a9731336dca7991680119793c917b1d"} err="failed to get container status \"da49f7b78ff9b1b0557982a4df2ebde43a9731336dca7991680119793c917b1d\": rpc error: code = NotFound desc = could not find container \"da49f7b78ff9b1b0557982a4df2ebde43a9731336dca7991680119793c917b1d\": container with ID starting with da49f7b78ff9b1b0557982a4df2ebde43a9731336dca7991680119793c917b1d not found: ID does not exist" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.845830 4937 scope.go:117] "RemoveContainer" containerID="7380a46d4a13eebb08f667fc944c523adef599821a7a2a344f2c535ed31ed736" Feb 25 16:15:09 crc kubenswrapper[4937]: E0225 16:15:09.846073 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7380a46d4a13eebb08f667fc944c523adef599821a7a2a344f2c535ed31ed736\": container with ID starting with 7380a46d4a13eebb08f667fc944c523adef599821a7a2a344f2c535ed31ed736 not found: ID does not exist" containerID="7380a46d4a13eebb08f667fc944c523adef599821a7a2a344f2c535ed31ed736" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.846104 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7380a46d4a13eebb08f667fc944c523adef599821a7a2a344f2c535ed31ed736"} err="failed to get container status \"7380a46d4a13eebb08f667fc944c523adef599821a7a2a344f2c535ed31ed736\": rpc error: code = NotFound desc = could not find container \"7380a46d4a13eebb08f667fc944c523adef599821a7a2a344f2c535ed31ed736\": container with ID starting with 7380a46d4a13eebb08f667fc944c523adef599821a7a2a344f2c535ed31ed736 not found: ID does not exist" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.846125 4937 scope.go:117] "RemoveContainer" containerID="ad6b23cf8e3088029b0b7408619854bf0f7fcf480318c179ab82ba27f2a4ac89" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.848173 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7030cdee-8f14-4cd8-959a-f941ac0414e9-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.871116 4937 scope.go:117] "RemoveContainer" containerID="f6247a24d8e0d89e30289f868db570590ecc6ea28beb1125e427ab2b7465f21a" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.895765 4937 scope.go:117] "RemoveContainer" containerID="94525afcd9ad485a9dec3066d6ce8188a8c1f567f10b9428d00d9fabcbd82c23" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.946324 4937 scope.go:117] "RemoveContainer" containerID="ad6b23cf8e3088029b0b7408619854bf0f7fcf480318c179ab82ba27f2a4ac89" Feb 25 16:15:09 crc kubenswrapper[4937]: E0225 16:15:09.947041 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad6b23cf8e3088029b0b7408619854bf0f7fcf480318c179ab82ba27f2a4ac89\": container with ID starting with ad6b23cf8e3088029b0b7408619854bf0f7fcf480318c179ab82ba27f2a4ac89 not found: ID does not exist" containerID="ad6b23cf8e3088029b0b7408619854bf0f7fcf480318c179ab82ba27f2a4ac89" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.947090 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad6b23cf8e3088029b0b7408619854bf0f7fcf480318c179ab82ba27f2a4ac89"} err="failed to get container status \"ad6b23cf8e3088029b0b7408619854bf0f7fcf480318c179ab82ba27f2a4ac89\": rpc error: code = NotFound desc = could not find container 
\"ad6b23cf8e3088029b0b7408619854bf0f7fcf480318c179ab82ba27f2a4ac89\": container with ID starting with ad6b23cf8e3088029b0b7408619854bf0f7fcf480318c179ab82ba27f2a4ac89 not found: ID does not exist" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.947133 4937 scope.go:117] "RemoveContainer" containerID="f6247a24d8e0d89e30289f868db570590ecc6ea28beb1125e427ab2b7465f21a" Feb 25 16:15:09 crc kubenswrapper[4937]: E0225 16:15:09.950696 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6247a24d8e0d89e30289f868db570590ecc6ea28beb1125e427ab2b7465f21a\": container with ID starting with f6247a24d8e0d89e30289f868db570590ecc6ea28beb1125e427ab2b7465f21a not found: ID does not exist" containerID="f6247a24d8e0d89e30289f868db570590ecc6ea28beb1125e427ab2b7465f21a" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.950745 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6247a24d8e0d89e30289f868db570590ecc6ea28beb1125e427ab2b7465f21a"} err="failed to get container status \"f6247a24d8e0d89e30289f868db570590ecc6ea28beb1125e427ab2b7465f21a\": rpc error: code = NotFound desc = could not find container \"f6247a24d8e0d89e30289f868db570590ecc6ea28beb1125e427ab2b7465f21a\": container with ID starting with f6247a24d8e0d89e30289f868db570590ecc6ea28beb1125e427ab2b7465f21a not found: ID does not exist" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.950778 4937 scope.go:117] "RemoveContainer" containerID="94525afcd9ad485a9dec3066d6ce8188a8c1f567f10b9428d00d9fabcbd82c23" Feb 25 16:15:09 crc kubenswrapper[4937]: E0225 16:15:09.951117 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94525afcd9ad485a9dec3066d6ce8188a8c1f567f10b9428d00d9fabcbd82c23\": container with ID starting with 94525afcd9ad485a9dec3066d6ce8188a8c1f567f10b9428d00d9fabcbd82c23 not found: ID does not exist" containerID="94525afcd9ad485a9dec3066d6ce8188a8c1f567f10b9428d00d9fabcbd82c23" Feb 25 16:15:09 crc kubenswrapper[4937]: I0225 16:15:09.951157 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94525afcd9ad485a9dec3066d6ce8188a8c1f567f10b9428d00d9fabcbd82c23"} err="failed to get container status \"94525afcd9ad485a9dec3066d6ce8188a8c1f567f10b9428d00d9fabcbd82c23\": rpc error: code = NotFound desc = could not find container \"94525afcd9ad485a9dec3066d6ce8188a8c1f567f10b9428d00d9fabcbd82c23\": container with ID starting with 94525afcd9ad485a9dec3066d6ce8188a8c1f567f10b9428d00d9fabcbd82c23 not found: ID does not exist" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.038870 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.050318 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.062710 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:10 crc kubenswrapper[4937]: E0225 16:15:10.063276 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f500ae5-7183-44f5-ba92-08705866baf2" containerName="extract-utilities" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063298 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f500ae5-7183-44f5-ba92-08705866baf2" containerName="extract-utilities" Feb 25 16:15:10 crc kubenswrapper[4937]: 
E0225 16:15:10.063315 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7fd2813-1349-41f2-bcee-c2a6650cfafc" containerName="collect-profiles" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063324 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7fd2813-1349-41f2-bcee-c2a6650cfafc" containerName="collect-profiles" Feb 25 16:15:10 crc kubenswrapper[4937]: E0225 16:15:10.063351 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="proxy-httpd" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063359 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="proxy-httpd" Feb 25 16:15:10 crc kubenswrapper[4937]: E0225 16:15:10.063375 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f500ae5-7183-44f5-ba92-08705866baf2" containerName="registry-server" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063383 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f500ae5-7183-44f5-ba92-08705866baf2" containerName="registry-server" Feb 25 16:15:10 crc kubenswrapper[4937]: E0225 16:15:10.063396 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f500ae5-7183-44f5-ba92-08705866baf2" containerName="extract-content" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063403 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f500ae5-7183-44f5-ba92-08705866baf2" containerName="extract-content" Feb 25 16:15:10 crc kubenswrapper[4937]: E0225 16:15:10.063423 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2462d0bc-6986-4337-a05a-863a45a45393" containerName="extract-content" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063430 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2462d0bc-6986-4337-a05a-863a45a45393" containerName="extract-content" Feb 25 16:15:10 crc kubenswrapper[4937]: E0225 16:15:10.063445 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="ceilometer-central-agent" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063454 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="ceilometer-central-agent" Feb 25 16:15:10 crc kubenswrapper[4937]: E0225 16:15:10.063473 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="ceilometer-notification-agent" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063495 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="ceilometer-notification-agent" Feb 25 16:15:10 crc kubenswrapper[4937]: E0225 16:15:10.063513 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="sg-core" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063524 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="sg-core" Feb 25 16:15:10 crc kubenswrapper[4937]: E0225 16:15:10.063533 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2462d0bc-6986-4337-a05a-863a45a45393" containerName="extract-utilities" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063541 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2462d0bc-6986-4337-a05a-863a45a45393" 
containerName="extract-utilities" Feb 25 16:15:10 crc kubenswrapper[4937]: E0225 16:15:10.063552 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2462d0bc-6986-4337-a05a-863a45a45393" containerName="registry-server" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063559 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2462d0bc-6986-4337-a05a-863a45a45393" containerName="registry-server" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063801 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7fd2813-1349-41f2-bcee-c2a6650cfafc" containerName="collect-profiles" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063820 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="sg-core" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063834 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f500ae5-7183-44f5-ba92-08705866baf2" containerName="registry-server" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063849 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="2462d0bc-6986-4337-a05a-863a45a45393" containerName="registry-server" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063863 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="ceilometer-central-agent" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063876 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="ceilometer-notification-agent" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.063890 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" containerName="proxy-httpd" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.066323 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.069330 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.069441 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.069596 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.077091 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.153697 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-config-data\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.153752 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.153796 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkthh\" (UniqueName: \"kubernetes.io/projected/e007763c-989c-471e-9ca7-0cc784d0fe13-kube-api-access-pkthh\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.154098 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e007763c-989c-471e-9ca7-0cc784d0fe13-log-httpd\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.154249 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e007763c-989c-471e-9ca7-0cc784d0fe13-run-httpd\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.154285 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.154326 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-scripts\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.154361 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.256472 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkthh\" (UniqueName: \"kubernetes.io/projected/e007763c-989c-471e-9ca7-0cc784d0fe13-kube-api-access-pkthh\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.256587 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e007763c-989c-471e-9ca7-0cc784d0fe13-log-httpd\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.256621 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e007763c-989c-471e-9ca7-0cc784d0fe13-run-httpd\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.256636 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.256658 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-scripts\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.256682 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.256804 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-config-data\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.256823 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.257369 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e007763c-989c-471e-9ca7-0cc784d0fe13-run-httpd\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.257373 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e007763c-989c-471e-9ca7-0cc784d0fe13-log-httpd\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.261751 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.261923 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.261957 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-config-data\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.262597 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-scripts\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.263567 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.279873 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkthh\" (UniqueName: \"kubernetes.io/projected/e007763c-989c-471e-9ca7-0cc784d0fe13-kube-api-access-pkthh\") pod \"ceilometer-0\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.384605 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:15:10 crc kubenswrapper[4937]: I0225 16:15:10.863805 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:11 crc kubenswrapper[4937]: I0225 16:15:11.380571 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f500ae5-7183-44f5-ba92-08705866baf2" path="/var/lib/kubelet/pods/4f500ae5-7183-44f5-ba92-08705866baf2/volumes" Feb 25 16:15:11 crc kubenswrapper[4937]: I0225 16:15:11.381938 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7030cdee-8f14-4cd8-959a-f941ac0414e9" path="/var/lib/kubelet/pods/7030cdee-8f14-4cd8-959a-f941ac0414e9/volumes" Feb 25 16:15:11 crc kubenswrapper[4937]: I0225 16:15:11.740036 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e007763c-989c-471e-9ca7-0cc784d0fe13","Type":"ContainerStarted","Data":"27c27a3a7989f22fafb9190ee1177bd45dcc9acd1897b99b77c7af695f1cc43a"} Feb 25 16:15:11 crc kubenswrapper[4937]: I0225 16:15:11.740432 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e007763c-989c-471e-9ca7-0cc784d0fe13","Type":"ContainerStarted","Data":"18ed67e43a992bc28d9b443c1ae331761e2ec30f43685b22c7122cdb5d8c7448"} Feb 25 16:15:12 crc kubenswrapper[4937]: I0225 16:15:12.077955 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 25 16:15:12 crc kubenswrapper[4937]: I0225 16:15:12.084944 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 25 16:15:12 crc kubenswrapper[4937]: I0225 16:15:12.089661 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 25 16:15:12 crc kubenswrapper[4937]: I0225 16:15:12.750918 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e007763c-989c-471e-9ca7-0cc784d0fe13","Type":"ContainerStarted","Data":"603a897230b4fc9b402d32832cbd10809eaecc118b2b7018babb3c68d363bad3"} Feb 25 16:15:12 crc kubenswrapper[4937]: I0225 16:15:12.756369 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 25 16:15:13 crc kubenswrapper[4937]: I0225 16:15:13.769472 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e007763c-989c-471e-9ca7-0cc784d0fe13","Type":"ContainerStarted","Data":"2dffe981fcb879beabce176b0b77c6c9f4b7e580425b48e15a82d438e06e03ef"} Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.332833 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.686334 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.780537 4937 generic.go:334] "Generic (PLEG): container finished" podID="692747a2-c012-4264-8346-f4aa6755f93c" containerID="621a0e9511587d4cf0fcdbcba93a91cd49682927e54602f1421c64484dd6aaa1" exitCode=137 Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.780609 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.780652 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"692747a2-c012-4264-8346-f4aa6755f93c","Type":"ContainerDied","Data":"621a0e9511587d4cf0fcdbcba93a91cd49682927e54602f1421c64484dd6aaa1"} Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.780708 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"692747a2-c012-4264-8346-f4aa6755f93c","Type":"ContainerDied","Data":"795571e170388f79ef9c790373e9b93225a6a490e8909e22e6f5e9e1e3bfd58e"} Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.780731 4937 scope.go:117] "RemoveContainer" containerID="621a0e9511587d4cf0fcdbcba93a91cd49682927e54602f1421c64484dd6aaa1" Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.804540 4937 scope.go:117] "RemoveContainer" containerID="621a0e9511587d4cf0fcdbcba93a91cd49682927e54602f1421c64484dd6aaa1" Feb 25 16:15:14 crc kubenswrapper[4937]: E0225 16:15:14.805310 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"621a0e9511587d4cf0fcdbcba93a91cd49682927e54602f1421c64484dd6aaa1\": container with ID starting with 621a0e9511587d4cf0fcdbcba93a91cd49682927e54602f1421c64484dd6aaa1 not found: ID does not exist" containerID="621a0e9511587d4cf0fcdbcba93a91cd49682927e54602f1421c64484dd6aaa1" Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.805352 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"621a0e9511587d4cf0fcdbcba93a91cd49682927e54602f1421c64484dd6aaa1"} err="failed to get container status \"621a0e9511587d4cf0fcdbcba93a91cd49682927e54602f1421c64484dd6aaa1\": rpc error: code = NotFound desc = could not find container \"621a0e9511587d4cf0fcdbcba93a91cd49682927e54602f1421c64484dd6aaa1\": container with ID starting with 621a0e9511587d4cf0fcdbcba93a91cd49682927e54602f1421c64484dd6aaa1 not found: ID does not exist" Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.849968 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/692747a2-c012-4264-8346-f4aa6755f93c-config-data\") pod \"692747a2-c012-4264-8346-f4aa6755f93c\" (UID: \"692747a2-c012-4264-8346-f4aa6755f93c\") " Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.850211 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692747a2-c012-4264-8346-f4aa6755f93c-combined-ca-bundle\") pod \"692747a2-c012-4264-8346-f4aa6755f93c\" (UID: \"692747a2-c012-4264-8346-f4aa6755f93c\") " Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.850379 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4mcf\" (UniqueName: \"kubernetes.io/projected/692747a2-c012-4264-8346-f4aa6755f93c-kube-api-access-t4mcf\") pod \"692747a2-c012-4264-8346-f4aa6755f93c\" (UID: \"692747a2-c012-4264-8346-f4aa6755f93c\") " Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.863661 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/692747a2-c012-4264-8346-f4aa6755f93c-kube-api-access-t4mcf" (OuterVolumeSpecName: "kube-api-access-t4mcf") pod "692747a2-c012-4264-8346-f4aa6755f93c" (UID: "692747a2-c012-4264-8346-f4aa6755f93c"). 
InnerVolumeSpecName "kube-api-access-t4mcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.884295 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/692747a2-c012-4264-8346-f4aa6755f93c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "692747a2-c012-4264-8346-f4aa6755f93c" (UID: "692747a2-c012-4264-8346-f4aa6755f93c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.898783 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/692747a2-c012-4264-8346-f4aa6755f93c-config-data" (OuterVolumeSpecName: "config-data") pod "692747a2-c012-4264-8346-f4aa6755f93c" (UID: "692747a2-c012-4264-8346-f4aa6755f93c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.953246 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4mcf\" (UniqueName: \"kubernetes.io/projected/692747a2-c012-4264-8346-f4aa6755f93c-kube-api-access-t4mcf\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.953278 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/692747a2-c012-4264-8346-f4aa6755f93c-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:14 crc kubenswrapper[4937]: I0225 16:15:14.953293 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692747a2-c012-4264-8346-f4aa6755f93c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.132365 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.157507 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.168853 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 16:15:15 crc kubenswrapper[4937]: E0225 16:15:15.169417 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="692747a2-c012-4264-8346-f4aa6755f93c" containerName="nova-cell1-novncproxy-novncproxy" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.169445 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="692747a2-c012-4264-8346-f4aa6755f93c" containerName="nova-cell1-novncproxy-novncproxy" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.169768 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="692747a2-c012-4264-8346-f4aa6755f93c" containerName="nova-cell1-novncproxy-novncproxy" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.170741 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.172768 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.172964 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.173074 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.187999 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.260661 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d447480a-3bd1-4934-9ba7-73122b37df7c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.260731 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d447480a-3bd1-4934-9ba7-73122b37df7c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.260834 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d447480a-3bd1-4934-9ba7-73122b37df7c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.260930 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdvbx\" (UniqueName: \"kubernetes.io/projected/d447480a-3bd1-4934-9ba7-73122b37df7c-kube-api-access-bdvbx\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.260961 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d447480a-3bd1-4934-9ba7-73122b37df7c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.362713 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d447480a-3bd1-4934-9ba7-73122b37df7c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.362790 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d447480a-3bd1-4934-9ba7-73122b37df7c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " 
pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.362888 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d447480a-3bd1-4934-9ba7-73122b37df7c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.362992 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdvbx\" (UniqueName: \"kubernetes.io/projected/d447480a-3bd1-4934-9ba7-73122b37df7c-kube-api-access-bdvbx\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.363022 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d447480a-3bd1-4934-9ba7-73122b37df7c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.367153 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d447480a-3bd1-4934-9ba7-73122b37df7c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.369040 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d447480a-3bd1-4934-9ba7-73122b37df7c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.369147 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d447480a-3bd1-4934-9ba7-73122b37df7c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.376516 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d447480a-3bd1-4934-9ba7-73122b37df7c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.383757 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="692747a2-c012-4264-8346-f4aa6755f93c" path="/var/lib/kubelet/pods/692747a2-c012-4264-8346-f4aa6755f93c/volumes" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.385206 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdvbx\" (UniqueName: \"kubernetes.io/projected/d447480a-3bd1-4934-9ba7-73122b37df7c-kube-api-access-bdvbx\") pod \"nova-cell1-novncproxy-0\" (UID: \"d447480a-3bd1-4934-9ba7-73122b37df7c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.512740 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.791866 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e007763c-989c-471e-9ca7-0cc784d0fe13","Type":"ContainerStarted","Data":"1b3e74b8e02f0c91d714018623caa680fc88d56534546efc29774e050b722c11"} Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.792050 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 16:15:15 crc kubenswrapper[4937]: I0225 16:15:15.822415 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.783986417 podStartE2EDuration="5.822393456s" podCreationTimestamp="2026-02-25 16:15:10 +0000 UTC" firstStartedPulling="2026-02-25 16:15:10.873069841 +0000 UTC m=+1761.886461741" lastFinishedPulling="2026-02-25 16:15:14.91147689 +0000 UTC m=+1765.924868780" observedRunningTime="2026-02-25 16:15:15.82053328 +0000 UTC m=+1766.833925190" watchObservedRunningTime="2026-02-25 16:15:15.822393456 +0000 UTC m=+1766.835785356" Feb 25 16:15:16 crc kubenswrapper[4937]: I0225 16:15:16.022617 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 16:15:16 crc kubenswrapper[4937]: W0225 16:15:16.028060 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd447480a_3bd1_4934_9ba7_73122b37df7c.slice/crio-064c6e2169afbd3f370bda8394dbb944b4433cf4610add97da4acbab60653b9d WatchSource:0}: Error finding container 064c6e2169afbd3f370bda8394dbb944b4433cf4610add97da4acbab60653b9d: Status 404 returned error can't find the container with id 064c6e2169afbd3f370bda8394dbb944b4433cf4610add97da4acbab60653b9d Feb 25 16:15:16 crc kubenswrapper[4937]: I0225 16:15:16.367944 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:15:16 crc kubenswrapper[4937]: E0225 16:15:16.368441 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:15:16 crc kubenswrapper[4937]: I0225 16:15:16.803369 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d447480a-3bd1-4934-9ba7-73122b37df7c","Type":"ContainerStarted","Data":"d4d26bb5e9459398deaf0eefa62f87fb43b2c81859bde3d72e89033fe4ad9cdd"} Feb 25 16:15:16 crc kubenswrapper[4937]: I0225 16:15:16.803420 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d447480a-3bd1-4934-9ba7-73122b37df7c","Type":"ContainerStarted","Data":"064c6e2169afbd3f370bda8394dbb944b4433cf4610add97da4acbab60653b9d"} Feb 25 16:15:16 crc kubenswrapper[4937]: I0225 16:15:16.836863 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.836844717 podStartE2EDuration="1.836844717s" podCreationTimestamp="2026-02-25 16:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-25 16:15:16.825852821 +0000 UTC m=+1767.839244711" watchObservedRunningTime="2026-02-25 16:15:16.836844717 +0000 UTC m=+1767.850236607" Feb 25 16:15:16 crc kubenswrapper[4937]: I0225 16:15:16.939672 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 25 16:15:16 crc kubenswrapper[4937]: I0225 16:15:16.940098 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 25 16:15:16 crc kubenswrapper[4937]: I0225 16:15:16.942412 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 25 16:15:16 crc kubenswrapper[4937]: I0225 16:15:16.943710 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 25 16:15:17 crc kubenswrapper[4937]: I0225 16:15:17.814516 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 25 16:15:17 crc kubenswrapper[4937]: I0225 16:15:17.817714 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.006589 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-bkv5w"] Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.008690 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.021278 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-bkv5w"] Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.151115 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.151188 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-config\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.151213 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.151326 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.151408 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-dns-svc\") pod 
\"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.151721 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbqb4\" (UniqueName: \"kubernetes.io/projected/f5fe867a-9b48-43f5-b336-107761af2328-kube-api-access-fbqb4\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.253587 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-config\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.253633 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.253725 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.253772 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.253825 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbqb4\" (UniqueName: \"kubernetes.io/projected/f5fe867a-9b48-43f5-b336-107761af2328-kube-api-access-fbqb4\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.253846 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.254703 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-config\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.254712 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: 
\"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.254776 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.254801 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.254917 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.275412 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbqb4\" (UniqueName: \"kubernetes.io/projected/f5fe867a-9b48-43f5-b336-107761af2328-kube-api-access-fbqb4\") pod \"dnsmasq-dns-5fd9b586ff-bkv5w\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.337444 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:18 crc kubenswrapper[4937]: W0225 16:15:18.835764 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5fe867a_9b48_43f5_b336_107761af2328.slice/crio-89976c099320a009c27438e24538906088c48c55ca9eb7de0b81f2c1915f3e3e WatchSource:0}: Error finding container 89976c099320a009c27438e24538906088c48c55ca9eb7de0b81f2c1915f3e3e: Status 404 returned error can't find the container with id 89976c099320a009c27438e24538906088c48c55ca9eb7de0b81f2c1915f3e3e Feb 25 16:15:18 crc kubenswrapper[4937]: I0225 16:15:18.840311 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-bkv5w"] Feb 25 16:15:19 crc kubenswrapper[4937]: I0225 16:15:19.835302 4937 generic.go:334] "Generic (PLEG): container finished" podID="f5fe867a-9b48-43f5-b336-107761af2328" containerID="dedd5a7aa711e5d16d70d5f5a0c88c9388e40acb4df1aa76452fcd6fba141d47" exitCode=0 Feb 25 16:15:19 crc kubenswrapper[4937]: I0225 16:15:19.835413 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" event={"ID":"f5fe867a-9b48-43f5-b336-107761af2328","Type":"ContainerDied","Data":"dedd5a7aa711e5d16d70d5f5a0c88c9388e40acb4df1aa76452fcd6fba141d47"} Feb 25 16:15:19 crc kubenswrapper[4937]: I0225 16:15:19.836253 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" event={"ID":"f5fe867a-9b48-43f5-b336-107761af2328","Type":"ContainerStarted","Data":"89976c099320a009c27438e24538906088c48c55ca9eb7de0b81f2c1915f3e3e"} Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.008157 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 
16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.008504 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="ceilometer-notification-agent" containerID="cri-o://603a897230b4fc9b402d32832cbd10809eaecc118b2b7018babb3c68d363bad3" gracePeriod=30 Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.008578 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="sg-core" containerID="cri-o://2dffe981fcb879beabce176b0b77c6c9f4b7e580425b48e15a82d438e06e03ef" gracePeriod=30 Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.008817 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="proxy-httpd" containerID="cri-o://1b3e74b8e02f0c91d714018623caa680fc88d56534546efc29774e050b722c11" gracePeriod=30 Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.009725 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="ceilometer-central-agent" containerID="cri-o://27c27a3a7989f22fafb9190ee1177bd45dcc9acd1897b99b77c7af695f1cc43a" gracePeriod=30 Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.488190 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.545719 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.856063 4937 generic.go:334] "Generic (PLEG): container finished" podID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerID="1b3e74b8e02f0c91d714018623caa680fc88d56534546efc29774e050b722c11" exitCode=0 Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.857281 4937 generic.go:334] "Generic (PLEG): container finished" podID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerID="2dffe981fcb879beabce176b0b77c6c9f4b7e580425b48e15a82d438e06e03ef" exitCode=2 Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.857351 4937 generic.go:334] "Generic (PLEG): container finished" podID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerID="603a897230b4fc9b402d32832cbd10809eaecc118b2b7018babb3c68d363bad3" exitCode=0 Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.857406 4937 generic.go:334] "Generic (PLEG): container finished" podID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerID="27c27a3a7989f22fafb9190ee1177bd45dcc9acd1897b99b77c7af695f1cc43a" exitCode=0 Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.856168 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e007763c-989c-471e-9ca7-0cc784d0fe13","Type":"ContainerDied","Data":"1b3e74b8e02f0c91d714018623caa680fc88d56534546efc29774e050b722c11"} Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.857618 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e007763c-989c-471e-9ca7-0cc784d0fe13","Type":"ContainerDied","Data":"2dffe981fcb879beabce176b0b77c6c9f4b7e580425b48e15a82d438e06e03ef"} Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.857690 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e007763c-989c-471e-9ca7-0cc784d0fe13","Type":"ContainerDied","Data":"603a897230b4fc9b402d32832cbd10809eaecc118b2b7018babb3c68d363bad3"} Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.857762 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e007763c-989c-471e-9ca7-0cc784d0fe13","Type":"ContainerDied","Data":"27c27a3a7989f22fafb9190ee1177bd45dcc9acd1897b99b77c7af695f1cc43a"} Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.880279 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" containerName="nova-api-log" containerID="cri-o://af41b91a5c20d45867b8bedc7d52fbc739737475d33b94bc2b7b07c71264348b" gracePeriod=30 Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.880980 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" event={"ID":"f5fe867a-9b48-43f5-b336-107761af2328","Type":"ContainerStarted","Data":"dd771656c0f5225b3d8beb28d9f0cbec718d26324c36001d17cfbe8e4d8740e2"} Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.881242 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" containerName="nova-api-api" containerID="cri-o://eebeb71401ee8dbae25a2c2de0f73b998b7d8939400b35d488f079c202ce1e76" gracePeriod=30 Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.881878 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:20 crc kubenswrapper[4937]: I0225 16:15:20.917267 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" podStartSLOduration=3.917252619 podStartE2EDuration="3.917252619s" podCreationTimestamp="2026-02-25 16:15:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:15:20.916556061 +0000 UTC m=+1771.929947951" watchObservedRunningTime="2026-02-25 16:15:20.917252619 +0000 UTC m=+1771.930644509" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.193262 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.362777 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e007763c-989c-471e-9ca7-0cc784d0fe13-log-httpd\") pod \"e007763c-989c-471e-9ca7-0cc784d0fe13\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.362888 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-ceilometer-tls-certs\") pod \"e007763c-989c-471e-9ca7-0cc784d0fe13\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.362985 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkthh\" (UniqueName: \"kubernetes.io/projected/e007763c-989c-471e-9ca7-0cc784d0fe13-kube-api-access-pkthh\") pod \"e007763c-989c-471e-9ca7-0cc784d0fe13\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.363027 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e007763c-989c-471e-9ca7-0cc784d0fe13-run-httpd\") pod \"e007763c-989c-471e-9ca7-0cc784d0fe13\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.363093 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-sg-core-conf-yaml\") pod \"e007763c-989c-471e-9ca7-0cc784d0fe13\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.363119 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e007763c-989c-471e-9ca7-0cc784d0fe13-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e007763c-989c-471e-9ca7-0cc784d0fe13" (UID: "e007763c-989c-471e-9ca7-0cc784d0fe13"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.363339 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-config-data\") pod \"e007763c-989c-471e-9ca7-0cc784d0fe13\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.363515 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-scripts\") pod \"e007763c-989c-471e-9ca7-0cc784d0fe13\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.363629 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e007763c-989c-471e-9ca7-0cc784d0fe13-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e007763c-989c-471e-9ca7-0cc784d0fe13" (UID: "e007763c-989c-471e-9ca7-0cc784d0fe13"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.363811 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-combined-ca-bundle\") pod \"e007763c-989c-471e-9ca7-0cc784d0fe13\" (UID: \"e007763c-989c-471e-9ca7-0cc784d0fe13\") " Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.364864 4937 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e007763c-989c-471e-9ca7-0cc784d0fe13-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.365146 4937 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e007763c-989c-471e-9ca7-0cc784d0fe13-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.369124 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e007763c-989c-471e-9ca7-0cc784d0fe13-kube-api-access-pkthh" (OuterVolumeSpecName: "kube-api-access-pkthh") pod "e007763c-989c-471e-9ca7-0cc784d0fe13" (UID: "e007763c-989c-471e-9ca7-0cc784d0fe13"). InnerVolumeSpecName "kube-api-access-pkthh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.388668 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-scripts" (OuterVolumeSpecName: "scripts") pod "e007763c-989c-471e-9ca7-0cc784d0fe13" (UID: "e007763c-989c-471e-9ca7-0cc784d0fe13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.405335 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e007763c-989c-471e-9ca7-0cc784d0fe13" (UID: "e007763c-989c-471e-9ca7-0cc784d0fe13"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.454725 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e007763c-989c-471e-9ca7-0cc784d0fe13" (UID: "e007763c-989c-471e-9ca7-0cc784d0fe13"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.457888 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e007763c-989c-471e-9ca7-0cc784d0fe13" (UID: "e007763c-989c-471e-9ca7-0cc784d0fe13"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.467988 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.468032 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.468045 4937 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.468054 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkthh\" (UniqueName: \"kubernetes.io/projected/e007763c-989c-471e-9ca7-0cc784d0fe13-kube-api-access-pkthh\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.468063 4937 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.534230 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-config-data" (OuterVolumeSpecName: "config-data") pod "e007763c-989c-471e-9ca7-0cc784d0fe13" (UID: "e007763c-989c-471e-9ca7-0cc784d0fe13"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.570112 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e007763c-989c-471e-9ca7-0cc784d0fe13-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.892940 4937 generic.go:334] "Generic (PLEG): container finished" podID="4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" containerID="af41b91a5c20d45867b8bedc7d52fbc739737475d33b94bc2b7b07c71264348b" exitCode=143 Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.893022 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497","Type":"ContainerDied","Data":"af41b91a5c20d45867b8bedc7d52fbc739737475d33b94bc2b7b07c71264348b"} Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.895885 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.895894 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e007763c-989c-471e-9ca7-0cc784d0fe13","Type":"ContainerDied","Data":"18ed67e43a992bc28d9b443c1ae331761e2ec30f43685b22c7122cdb5d8c7448"} Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.895938 4937 scope.go:117] "RemoveContainer" containerID="1b3e74b8e02f0c91d714018623caa680fc88d56534546efc29774e050b722c11" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.929823 4937 scope.go:117] "RemoveContainer" containerID="2dffe981fcb879beabce176b0b77c6c9f4b7e580425b48e15a82d438e06e03ef" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.943263 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.953112 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.969159 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:21 crc kubenswrapper[4937]: E0225 16:15:21.969624 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="ceilometer-notification-agent" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.969643 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="ceilometer-notification-agent" Feb 25 16:15:21 crc kubenswrapper[4937]: E0225 16:15:21.969666 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="ceilometer-central-agent" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.969672 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="ceilometer-central-agent" Feb 25 16:15:21 crc kubenswrapper[4937]: E0225 16:15:21.969687 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="proxy-httpd" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.969693 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="proxy-httpd" Feb 25 16:15:21 crc kubenswrapper[4937]: E0225 16:15:21.969703 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="sg-core" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.969711 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="sg-core" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.969936 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="ceilometer-central-agent" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.969948 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="ceilometer-notification-agent" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.969964 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" containerName="sg-core" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.969979 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" 
containerName="proxy-httpd" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.974665 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.978024 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.978101 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.978232 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.979401 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:21 crc kubenswrapper[4937]: I0225 16:15:21.995832 4937 scope.go:117] "RemoveContainer" containerID="603a897230b4fc9b402d32832cbd10809eaecc118b2b7018babb3c68d363bad3" Feb 25 16:15:22 crc kubenswrapper[4937]: E0225 16:15:22.046772 4937 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode007763c_989c_471e_9ca7_0cc784d0fe13.slice\": RecentStats: unable to find data in memory cache]" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.049695 4937 scope.go:117] "RemoveContainer" containerID="27c27a3a7989f22fafb9190ee1177bd45dcc9acd1897b99b77c7af695f1cc43a" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.117970 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.118026 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b177e50-d02b-4342-b340-5aaae16d6d9d-run-httpd\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.118057 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw4qp\" (UniqueName: \"kubernetes.io/projected/5b177e50-d02b-4342-b340-5aaae16d6d9d-kube-api-access-vw4qp\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.118105 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-config-data\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.118204 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-scripts\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.118247 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.118272 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.118344 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b177e50-d02b-4342-b340-5aaae16d6d9d-log-httpd\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.220328 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-config-data\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.220733 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-scripts\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.220757 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.220778 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.221337 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b177e50-d02b-4342-b340-5aaae16d6d9d-log-httpd\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.221472 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.221525 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b177e50-d02b-4342-b340-5aaae16d6d9d-run-httpd\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.221545 4937 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b177e50-d02b-4342-b340-5aaae16d6d9d-log-httpd\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.221556 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw4qp\" (UniqueName: \"kubernetes.io/projected/5b177e50-d02b-4342-b340-5aaae16d6d9d-kube-api-access-vw4qp\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.222098 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b177e50-d02b-4342-b340-5aaae16d6d9d-run-httpd\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.226017 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.226635 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-config-data\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.227435 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-scripts\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.227565 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.235248 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.244536 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw4qp\" (UniqueName: \"kubernetes.io/projected/5b177e50-d02b-4342-b340-5aaae16d6d9d-kube-api-access-vw4qp\") pod \"ceilometer-0\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.306332 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.339202 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.845434 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:22 crc kubenswrapper[4937]: I0225 16:15:22.915395 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b177e50-d02b-4342-b340-5aaae16d6d9d","Type":"ContainerStarted","Data":"caa15fef17152e718e8c5dc86381b5ed25a31f96cd1d940f35661d7bb24e4ec5"} Feb 25 16:15:23 crc kubenswrapper[4937]: I0225 16:15:23.381653 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e007763c-989c-471e-9ca7-0cc784d0fe13" path="/var/lib/kubelet/pods/e007763c-989c-471e-9ca7-0cc784d0fe13/volumes" Feb 25 16:15:23 crc kubenswrapper[4937]: I0225 16:15:23.932074 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b177e50-d02b-4342-b340-5aaae16d6d9d","Type":"ContainerStarted","Data":"1c07f64dc1c0cb3ad965e9159165717b27afe6a5eb982f40e23424b490c3f60e"} Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.635014 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.693413 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-logs\") pod \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.693814 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-logs" (OuterVolumeSpecName: "logs") pod "4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" (UID: "4dc3b1d2-cd5c-417b-b62f-e5fbc2473497"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.693852 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6vrc\" (UniqueName: \"kubernetes.io/projected/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-kube-api-access-n6vrc\") pod \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.693903 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-combined-ca-bundle\") pod \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.694060 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-config-data\") pod \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\" (UID: \"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497\") " Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.694799 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.703557 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-kube-api-access-n6vrc" (OuterVolumeSpecName: "kube-api-access-n6vrc") pod "4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" (UID: "4dc3b1d2-cd5c-417b-b62f-e5fbc2473497"). InnerVolumeSpecName "kube-api-access-n6vrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.741033 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-config-data" (OuterVolumeSpecName: "config-data") pod "4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" (UID: "4dc3b1d2-cd5c-417b-b62f-e5fbc2473497"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.757622 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" (UID: "4dc3b1d2-cd5c-417b-b62f-e5fbc2473497"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.797503 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6vrc\" (UniqueName: \"kubernetes.io/projected/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-kube-api-access-n6vrc\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.797535 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.797544 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.945993 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.946016 4937 generic.go:334] "Generic (PLEG): container finished" podID="4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" containerID="eebeb71401ee8dbae25a2c2de0f73b998b7d8939400b35d488f079c202ce1e76" exitCode=0 Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.946100 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497","Type":"ContainerDied","Data":"eebeb71401ee8dbae25a2c2de0f73b998b7d8939400b35d488f079c202ce1e76"} Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.946131 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4dc3b1d2-cd5c-417b-b62f-e5fbc2473497","Type":"ContainerDied","Data":"c81d36e436d352ee6de42e85b8668cb283e3765aaaa25664c07a2365f5283873"} Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.946148 4937 scope.go:117] "RemoveContainer" containerID="eebeb71401ee8dbae25a2c2de0f73b998b7d8939400b35d488f079c202ce1e76" Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.951140 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b177e50-d02b-4342-b340-5aaae16d6d9d","Type":"ContainerStarted","Data":"405182a084014d6ab8dcf89a2de489c34e04d8390fb71d0c168acdf651d778a8"} Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.951190 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b177e50-d02b-4342-b340-5aaae16d6d9d","Type":"ContainerStarted","Data":"b8f9a7333dd514030ac779957f78cdc1979255b624ac44b51d557b8ee3e5d324"} Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.981873 4937 scope.go:117] "RemoveContainer" containerID="af41b91a5c20d45867b8bedc7d52fbc739737475d33b94bc2b7b07c71264348b" Feb 25 16:15:24 crc kubenswrapper[4937]: I0225 16:15:24.986627 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.006920 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.008105 4937 scope.go:117] "RemoveContainer" containerID="eebeb71401ee8dbae25a2c2de0f73b998b7d8939400b35d488f079c202ce1e76" Feb 25 16:15:25 crc kubenswrapper[4937]: E0225 16:15:25.009729 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"eebeb71401ee8dbae25a2c2de0f73b998b7d8939400b35d488f079c202ce1e76\": container with ID starting with eebeb71401ee8dbae25a2c2de0f73b998b7d8939400b35d488f079c202ce1e76 not found: ID does not exist" containerID="eebeb71401ee8dbae25a2c2de0f73b998b7d8939400b35d488f079c202ce1e76" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.009773 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eebeb71401ee8dbae25a2c2de0f73b998b7d8939400b35d488f079c202ce1e76"} err="failed to get container status \"eebeb71401ee8dbae25a2c2de0f73b998b7d8939400b35d488f079c202ce1e76\": rpc error: code = NotFound desc = could not find container \"eebeb71401ee8dbae25a2c2de0f73b998b7d8939400b35d488f079c202ce1e76\": container with ID starting with eebeb71401ee8dbae25a2c2de0f73b998b7d8939400b35d488f079c202ce1e76 not found: ID does not exist" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.009807 4937 scope.go:117] "RemoveContainer" containerID="af41b91a5c20d45867b8bedc7d52fbc739737475d33b94bc2b7b07c71264348b" Feb 25 16:15:25 crc kubenswrapper[4937]: E0225 16:15:25.011997 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af41b91a5c20d45867b8bedc7d52fbc739737475d33b94bc2b7b07c71264348b\": container with ID starting with af41b91a5c20d45867b8bedc7d52fbc739737475d33b94bc2b7b07c71264348b not found: ID does not exist" containerID="af41b91a5c20d45867b8bedc7d52fbc739737475d33b94bc2b7b07c71264348b" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.012032 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af41b91a5c20d45867b8bedc7d52fbc739737475d33b94bc2b7b07c71264348b"} err="failed to get container status \"af41b91a5c20d45867b8bedc7d52fbc739737475d33b94bc2b7b07c71264348b\": rpc error: code = NotFound desc = could not find container \"af41b91a5c20d45867b8bedc7d52fbc739737475d33b94bc2b7b07c71264348b\": container with ID starting with af41b91a5c20d45867b8bedc7d52fbc739737475d33b94bc2b7b07c71264348b not found: ID does not exist" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.027984 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 25 16:15:25 crc kubenswrapper[4937]: E0225 16:15:25.028550 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" containerName="nova-api-log" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.028571 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" containerName="nova-api-log" Feb 25 16:15:25 crc kubenswrapper[4937]: E0225 16:15:25.028600 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" containerName="nova-api-api" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.028608 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" containerName="nova-api-api" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.028796 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" containerName="nova-api-api" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.028817 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" containerName="nova-api-log" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.030021 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.041779 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.043281 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.043362 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.043568 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.103939 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.104058 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d227184-5e5f-4de7-b13a-1d38af727834-logs\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.104097 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-public-tls-certs\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.104267 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-config-data\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.104305 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.104450 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qwfb\" (UniqueName: \"kubernetes.io/projected/5d227184-5e5f-4de7-b13a-1d38af727834-kube-api-access-8qwfb\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.206318 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-public-tls-certs\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.208272 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-config-data\") pod \"nova-api-0\" (UID: 
\"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.208354 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.208716 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qwfb\" (UniqueName: \"kubernetes.io/projected/5d227184-5e5f-4de7-b13a-1d38af727834-kube-api-access-8qwfb\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.208985 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.209160 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d227184-5e5f-4de7-b13a-1d38af727834-logs\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.209801 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d227184-5e5f-4de7-b13a-1d38af727834-logs\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.210504 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-public-tls-certs\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.213808 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.214092 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-config-data\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.214153 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.231633 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qwfb\" (UniqueName: \"kubernetes.io/projected/5d227184-5e5f-4de7-b13a-1d38af727834-kube-api-access-8qwfb\") pod \"nova-api-0\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " pod="openstack/nova-api-0" 
Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.351643 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.384118 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dc3b1d2-cd5c-417b-b62f-e5fbc2473497" path="/var/lib/kubelet/pods/4dc3b1d2-cd5c-417b-b62f-e5fbc2473497/volumes" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.513841 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.544979 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.858852 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:15:25 crc kubenswrapper[4937]: I0225 16:15:25.983095 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d227184-5e5f-4de7-b13a-1d38af727834","Type":"ContainerStarted","Data":"e71314ba221aed314943e06e65f6e6675f5008e2ff2df0d582bd03afc2ec4235"} Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.004917 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.209511 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7pb8b"] Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.212071 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.218772 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.218961 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.219881 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7pb8b"] Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.331265 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crkzk\" (UniqueName: \"kubernetes.io/projected/18c91384-797d-4ca1-8a29-f800994d26b7-kube-api-access-crkzk\") pod \"nova-cell1-cell-mapping-7pb8b\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.331498 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-config-data\") pod \"nova-cell1-cell-mapping-7pb8b\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.331772 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7pb8b\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.331878 
4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-scripts\") pod \"nova-cell1-cell-mapping-7pb8b\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.433463 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-config-data\") pod \"nova-cell1-cell-mapping-7pb8b\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.433678 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7pb8b\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.433728 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-scripts\") pod \"nova-cell1-cell-mapping-7pb8b\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.433828 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crkzk\" (UniqueName: \"kubernetes.io/projected/18c91384-797d-4ca1-8a29-f800994d26b7-kube-api-access-crkzk\") pod \"nova-cell1-cell-mapping-7pb8b\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.438666 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7pb8b\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.439340 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-config-data\") pod \"nova-cell1-cell-mapping-7pb8b\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.450749 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-scripts\") pod \"nova-cell1-cell-mapping-7pb8b\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.454154 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crkzk\" (UniqueName: \"kubernetes.io/projected/18c91384-797d-4ca1-8a29-f800994d26b7-kube-api-access-crkzk\") pod \"nova-cell1-cell-mapping-7pb8b\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:26 crc kubenswrapper[4937]: I0225 16:15:26.580948 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:27 crc kubenswrapper[4937]: I0225 16:15:27.012849 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d227184-5e5f-4de7-b13a-1d38af727834","Type":"ContainerStarted","Data":"8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721"} Feb 25 16:15:27 crc kubenswrapper[4937]: I0225 16:15:27.013450 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d227184-5e5f-4de7-b13a-1d38af727834","Type":"ContainerStarted","Data":"92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010"} Feb 25 16:15:27 crc kubenswrapper[4937]: I0225 16:15:27.022975 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="ceilometer-central-agent" containerID="cri-o://1c07f64dc1c0cb3ad965e9159165717b27afe6a5eb982f40e23424b490c3f60e" gracePeriod=30 Feb 25 16:15:27 crc kubenswrapper[4937]: I0225 16:15:27.023274 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b177e50-d02b-4342-b340-5aaae16d6d9d","Type":"ContainerStarted","Data":"3a6b7c334376bbde55c11093d109d72043191d8329c21d12a66602c395b1900b"} Feb 25 16:15:27 crc kubenswrapper[4937]: I0225 16:15:27.023313 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 16:15:27 crc kubenswrapper[4937]: I0225 16:15:27.023351 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="proxy-httpd" containerID="cri-o://3a6b7c334376bbde55c11093d109d72043191d8329c21d12a66602c395b1900b" gracePeriod=30 Feb 25 16:15:27 crc kubenswrapper[4937]: I0225 16:15:27.023396 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="sg-core" containerID="cri-o://405182a084014d6ab8dcf89a2de489c34e04d8390fb71d0c168acdf651d778a8" gracePeriod=30 Feb 25 16:15:27 crc kubenswrapper[4937]: I0225 16:15:27.023428 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="ceilometer-notification-agent" containerID="cri-o://b8f9a7333dd514030ac779957f78cdc1979255b624ac44b51d557b8ee3e5d324" gracePeriod=30 Feb 25 16:15:27 crc kubenswrapper[4937]: I0225 16:15:27.031846 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.031830885 podStartE2EDuration="3.031830885s" podCreationTimestamp="2026-02-25 16:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:15:27.03125173 +0000 UTC m=+1778.044643620" watchObservedRunningTime="2026-02-25 16:15:27.031830885 +0000 UTC m=+1778.045222765" Feb 25 16:15:27 crc kubenswrapper[4937]: I0225 16:15:27.073543 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.263865116 podStartE2EDuration="6.07352285s" podCreationTimestamp="2026-02-25 16:15:21 +0000 UTC" firstStartedPulling="2026-02-25 16:15:22.847049317 +0000 UTC m=+1773.860441207" lastFinishedPulling="2026-02-25 16:15:26.656707051 +0000 UTC m=+1777.670098941" observedRunningTime="2026-02-25 
16:15:27.053103918 +0000 UTC m=+1778.066495808" watchObservedRunningTime="2026-02-25 16:15:27.07352285 +0000 UTC m=+1778.086914740" Feb 25 16:15:27 crc kubenswrapper[4937]: I0225 16:15:27.183583 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7pb8b"] Feb 25 16:15:28 crc kubenswrapper[4937]: I0225 16:15:28.032692 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7pb8b" event={"ID":"18c91384-797d-4ca1-8a29-f800994d26b7","Type":"ContainerStarted","Data":"08fb0f4b3814f3eb26b2a568a15dab936c66c22e0f142bc12dd7a55f5ce2f1d9"} Feb 25 16:15:28 crc kubenswrapper[4937]: I0225 16:15:28.033096 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7pb8b" event={"ID":"18c91384-797d-4ca1-8a29-f800994d26b7","Type":"ContainerStarted","Data":"03aa0c6f00a0642eea702175d0a7638f364dcbc8cdb10648aae33310f71398b7"} Feb 25 16:15:28 crc kubenswrapper[4937]: I0225 16:15:28.035082 4937 generic.go:334] "Generic (PLEG): container finished" podID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerID="3a6b7c334376bbde55c11093d109d72043191d8329c21d12a66602c395b1900b" exitCode=0 Feb 25 16:15:28 crc kubenswrapper[4937]: I0225 16:15:28.035112 4937 generic.go:334] "Generic (PLEG): container finished" podID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerID="405182a084014d6ab8dcf89a2de489c34e04d8390fb71d0c168acdf651d778a8" exitCode=2 Feb 25 16:15:28 crc kubenswrapper[4937]: I0225 16:15:28.035121 4937 generic.go:334] "Generic (PLEG): container finished" podID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerID="b8f9a7333dd514030ac779957f78cdc1979255b624ac44b51d557b8ee3e5d324" exitCode=0 Feb 25 16:15:28 crc kubenswrapper[4937]: I0225 16:15:28.035160 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b177e50-d02b-4342-b340-5aaae16d6d9d","Type":"ContainerDied","Data":"3a6b7c334376bbde55c11093d109d72043191d8329c21d12a66602c395b1900b"} Feb 25 16:15:28 crc kubenswrapper[4937]: I0225 16:15:28.035202 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b177e50-d02b-4342-b340-5aaae16d6d9d","Type":"ContainerDied","Data":"405182a084014d6ab8dcf89a2de489c34e04d8390fb71d0c168acdf651d778a8"} Feb 25 16:15:28 crc kubenswrapper[4937]: I0225 16:15:28.035220 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b177e50-d02b-4342-b340-5aaae16d6d9d","Type":"ContainerDied","Data":"b8f9a7333dd514030ac779957f78cdc1979255b624ac44b51d557b8ee3e5d324"} Feb 25 16:15:28 crc kubenswrapper[4937]: I0225 16:15:28.054405 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7pb8b" podStartSLOduration=2.054382279 podStartE2EDuration="2.054382279s" podCreationTimestamp="2026-02-25 16:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:15:28.047360093 +0000 UTC m=+1779.060751983" watchObservedRunningTime="2026-02-25 16:15:28.054382279 +0000 UTC m=+1779.067774169" Feb 25 16:15:28 crc kubenswrapper[4937]: I0225 16:15:28.339661 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:15:28 crc kubenswrapper[4937]: I0225 16:15:28.411329 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-vm27p"] Feb 25 16:15:28 crc kubenswrapper[4937]: I0225 
16:15:28.411685 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cd565959-vm27p" podUID="22da4be7-1bfd-4df2-a66b-8bd47f08269c" containerName="dnsmasq-dns" containerID="cri-o://4fce1db9ef8aed180141acc2e44adc14eee51286f48564b0d2fdf56c0b65e530" gracePeriod=10 Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.046269 4937 generic.go:334] "Generic (PLEG): container finished" podID="22da4be7-1bfd-4df2-a66b-8bd47f08269c" containerID="4fce1db9ef8aed180141acc2e44adc14eee51286f48564b0d2fdf56c0b65e530" exitCode=0 Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.046327 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-vm27p" event={"ID":"22da4be7-1bfd-4df2-a66b-8bd47f08269c","Type":"ContainerDied","Data":"4fce1db9ef8aed180141acc2e44adc14eee51286f48564b0d2fdf56c0b65e530"} Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.046373 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-vm27p" event={"ID":"22da4be7-1bfd-4df2-a66b-8bd47f08269c","Type":"ContainerDied","Data":"a93ffa38a03f0faea423b344ac23aeaba4b1e5a5d5d052810a0c558495c3ceac"} Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.046384 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a93ffa38a03f0faea423b344ac23aeaba4b1e5a5d5d052810a0c558495c3ceac" Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.095423 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.211220 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtk7l\" (UniqueName: \"kubernetes.io/projected/22da4be7-1bfd-4df2-a66b-8bd47f08269c-kube-api-access-wtk7l\") pod \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.211279 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-config\") pod \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.211322 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-ovsdbserver-nb\") pod \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.211360 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-dns-svc\") pod \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.211444 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-ovsdbserver-sb\") pod \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.211574 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-dns-swift-storage-0\") pod \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\" (UID: \"22da4be7-1bfd-4df2-a66b-8bd47f08269c\") " Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.247808 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22da4be7-1bfd-4df2-a66b-8bd47f08269c-kube-api-access-wtk7l" (OuterVolumeSpecName: "kube-api-access-wtk7l") pod "22da4be7-1bfd-4df2-a66b-8bd47f08269c" (UID: "22da4be7-1bfd-4df2-a66b-8bd47f08269c"). InnerVolumeSpecName "kube-api-access-wtk7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.317177 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "22da4be7-1bfd-4df2-a66b-8bd47f08269c" (UID: "22da4be7-1bfd-4df2-a66b-8bd47f08269c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.331009 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "22da4be7-1bfd-4df2-a66b-8bd47f08269c" (UID: "22da4be7-1bfd-4df2-a66b-8bd47f08269c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.331637 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.331657 4937 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.331669 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtk7l\" (UniqueName: \"kubernetes.io/projected/22da4be7-1bfd-4df2-a66b-8bd47f08269c-kube-api-access-wtk7l\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.337008 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "22da4be7-1bfd-4df2-a66b-8bd47f08269c" (UID: "22da4be7-1bfd-4df2-a66b-8bd47f08269c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.337149 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-config" (OuterVolumeSpecName: "config") pod "22da4be7-1bfd-4df2-a66b-8bd47f08269c" (UID: "22da4be7-1bfd-4df2-a66b-8bd47f08269c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.372012 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22da4be7-1bfd-4df2-a66b-8bd47f08269c" (UID: "22da4be7-1bfd-4df2-a66b-8bd47f08269c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.434733 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.434758 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:29 crc kubenswrapper[4937]: I0225 16:15:29.434766 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22da4be7-1bfd-4df2-a66b-8bd47f08269c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.059100 4937 generic.go:334] "Generic (PLEG): container finished" podID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerID="1c07f64dc1c0cb3ad965e9159165717b27afe6a5eb982f40e23424b490c3f60e" exitCode=0 Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.059163 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b177e50-d02b-4342-b340-5aaae16d6d9d","Type":"ContainerDied","Data":"1c07f64dc1c0cb3ad965e9159165717b27afe6a5eb982f40e23424b490c3f60e"} Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.059514 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-vm27p" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.104270 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-vm27p"] Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.118855 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-vm27p"] Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.368582 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:15:30 crc kubenswrapper[4937]: E0225 16:15:30.369012 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.426630 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.456902 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-scripts\") pod \"5b177e50-d02b-4342-b340-5aaae16d6d9d\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.457000 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-sg-core-conf-yaml\") pod \"5b177e50-d02b-4342-b340-5aaae16d6d9d\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.457142 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-ceilometer-tls-certs\") pod \"5b177e50-d02b-4342-b340-5aaae16d6d9d\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.457225 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-combined-ca-bundle\") pod \"5b177e50-d02b-4342-b340-5aaae16d6d9d\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.457270 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b177e50-d02b-4342-b340-5aaae16d6d9d-run-httpd\") pod \"5b177e50-d02b-4342-b340-5aaae16d6d9d\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.457311 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b177e50-d02b-4342-b340-5aaae16d6d9d-log-httpd\") pod \"5b177e50-d02b-4342-b340-5aaae16d6d9d\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.457385 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw4qp\" (UniqueName: \"kubernetes.io/projected/5b177e50-d02b-4342-b340-5aaae16d6d9d-kube-api-access-vw4qp\") pod \"5b177e50-d02b-4342-b340-5aaae16d6d9d\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.457538 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-config-data\") pod \"5b177e50-d02b-4342-b340-5aaae16d6d9d\" (UID: \"5b177e50-d02b-4342-b340-5aaae16d6d9d\") " Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.470233 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b177e50-d02b-4342-b340-5aaae16d6d9d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5b177e50-d02b-4342-b340-5aaae16d6d9d" (UID: "5b177e50-d02b-4342-b340-5aaae16d6d9d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.471270 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b177e50-d02b-4342-b340-5aaae16d6d9d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5b177e50-d02b-4342-b340-5aaae16d6d9d" (UID: "5b177e50-d02b-4342-b340-5aaae16d6d9d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.473135 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-scripts" (OuterVolumeSpecName: "scripts") pod "5b177e50-d02b-4342-b340-5aaae16d6d9d" (UID: "5b177e50-d02b-4342-b340-5aaae16d6d9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.473461 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b177e50-d02b-4342-b340-5aaae16d6d9d-kube-api-access-vw4qp" (OuterVolumeSpecName: "kube-api-access-vw4qp") pod "5b177e50-d02b-4342-b340-5aaae16d6d9d" (UID: "5b177e50-d02b-4342-b340-5aaae16d6d9d"). InnerVolumeSpecName "kube-api-access-vw4qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.503276 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5b177e50-d02b-4342-b340-5aaae16d6d9d" (UID: "5b177e50-d02b-4342-b340-5aaae16d6d9d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.537300 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5b177e50-d02b-4342-b340-5aaae16d6d9d" (UID: "5b177e50-d02b-4342-b340-5aaae16d6d9d"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.561055 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.561261 4937 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.561365 4937 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.561458 4937 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b177e50-d02b-4342-b340-5aaae16d6d9d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.561605 4937 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b177e50-d02b-4342-b340-5aaae16d6d9d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.561663 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw4qp\" (UniqueName: \"kubernetes.io/projected/5b177e50-d02b-4342-b340-5aaae16d6d9d-kube-api-access-vw4qp\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.604989 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b177e50-d02b-4342-b340-5aaae16d6d9d" (UID: "5b177e50-d02b-4342-b340-5aaae16d6d9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.605507 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-config-data" (OuterVolumeSpecName: "config-data") pod "5b177e50-d02b-4342-b340-5aaae16d6d9d" (UID: "5b177e50-d02b-4342-b340-5aaae16d6d9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.662746 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:30 crc kubenswrapper[4937]: I0225 16:15:30.662987 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b177e50-d02b-4342-b340-5aaae16d6d9d-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.074867 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b177e50-d02b-4342-b340-5aaae16d6d9d","Type":"ContainerDied","Data":"caa15fef17152e718e8c5dc86381b5ed25a31f96cd1d940f35661d7bb24e4ec5"} Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.074931 4937 scope.go:117] "RemoveContainer" containerID="3a6b7c334376bbde55c11093d109d72043191d8329c21d12a66602c395b1900b" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.075067 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.113834 4937 scope.go:117] "RemoveContainer" containerID="405182a084014d6ab8dcf89a2de489c34e04d8390fb71d0c168acdf651d778a8" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.123844 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.142854 4937 scope.go:117] "RemoveContainer" containerID="b8f9a7333dd514030ac779957f78cdc1979255b624ac44b51d557b8ee3e5d324" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.149652 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.168276 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:31 crc kubenswrapper[4937]: E0225 16:15:31.168754 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22da4be7-1bfd-4df2-a66b-8bd47f08269c" containerName="dnsmasq-dns" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.168775 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="22da4be7-1bfd-4df2-a66b-8bd47f08269c" containerName="dnsmasq-dns" Feb 25 16:15:31 crc kubenswrapper[4937]: E0225 16:15:31.168799 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="ceilometer-central-agent" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.168808 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="ceilometer-central-agent" Feb 25 16:15:31 crc kubenswrapper[4937]: E0225 16:15:31.168842 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="ceilometer-notification-agent" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.168851 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="ceilometer-notification-agent" Feb 25 16:15:31 crc kubenswrapper[4937]: E0225 16:15:31.168871 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22da4be7-1bfd-4df2-a66b-8bd47f08269c" containerName="init" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.168878 4937 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="22da4be7-1bfd-4df2-a66b-8bd47f08269c" containerName="init" Feb 25 16:15:31 crc kubenswrapper[4937]: E0225 16:15:31.168894 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="sg-core" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.168900 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="sg-core" Feb 25 16:15:31 crc kubenswrapper[4937]: E0225 16:15:31.168917 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="proxy-httpd" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.168924 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="proxy-httpd" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.169182 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="proxy-httpd" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.169199 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="ceilometer-central-agent" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.169217 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="ceilometer-notification-agent" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.169241 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="22da4be7-1bfd-4df2-a66b-8bd47f08269c" containerName="dnsmasq-dns" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.169256 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" containerName="sg-core" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.171447 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.175962 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.176133 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.185251 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.186599 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.191230 4937 scope.go:117] "RemoveContainer" containerID="1c07f64dc1c0cb3ad965e9159165717b27afe6a5eb982f40e23424b490c3f60e" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.277897 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5xnd\" (UniqueName: \"kubernetes.io/projected/1c5876dc-2444-4491-8ab8-3360d3f4a84c-kube-api-access-w5xnd\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.278024 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-scripts\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.278107 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.278129 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.278155 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-config-data\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.278193 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c5876dc-2444-4491-8ab8-3360d3f4a84c-run-httpd\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.278214 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: 
I0225 16:15:31.278300 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c5876dc-2444-4491-8ab8-3360d3f4a84c-log-httpd\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.380765 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c5876dc-2444-4491-8ab8-3360d3f4a84c-log-httpd\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.381246 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5xnd\" (UniqueName: \"kubernetes.io/projected/1c5876dc-2444-4491-8ab8-3360d3f4a84c-kube-api-access-w5xnd\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.381540 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-scripts\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.381792 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.381961 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.382090 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-config-data\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.382233 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c5876dc-2444-4491-8ab8-3360d3f4a84c-run-httpd\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.382373 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.386589 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c5876dc-2444-4491-8ab8-3360d3f4a84c-log-httpd\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 
16:15:31.388611 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c5876dc-2444-4491-8ab8-3360d3f4a84c-run-httpd\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.390226 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.391774 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.392584 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-config-data\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.394381 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-scripts\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.394593 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22da4be7-1bfd-4df2-a66b-8bd47f08269c" path="/var/lib/kubelet/pods/22da4be7-1bfd-4df2-a66b-8bd47f08269c/volumes" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.395916 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b177e50-d02b-4342-b340-5aaae16d6d9d" path="/var/lib/kubelet/pods/5b177e50-d02b-4342-b340-5aaae16d6d9d/volumes" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.399159 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.404631 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5xnd\" (UniqueName: \"kubernetes.io/projected/1c5876dc-2444-4491-8ab8-3360d3f4a84c-kube-api-access-w5xnd\") pod \"ceilometer-0\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " pod="openstack/ceilometer-0" Feb 25 16:15:31 crc kubenswrapper[4937]: I0225 16:15:31.496000 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:15:32 crc kubenswrapper[4937]: I0225 16:15:32.038768 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:15:32 crc kubenswrapper[4937]: I0225 16:15:32.088296 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c5876dc-2444-4491-8ab8-3360d3f4a84c","Type":"ContainerStarted","Data":"38fb3c544f77c94d0c9f8954f6738caab19bc5e19992e945b72657058f7bf42d"} Feb 25 16:15:33 crc kubenswrapper[4937]: I0225 16:15:33.105400 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c5876dc-2444-4491-8ab8-3360d3f4a84c","Type":"ContainerStarted","Data":"3fd5abba776fd0bb13bae5f9ebc8fa3d982275f3d6e89ad2da44ef74dd961a16"} Feb 25 16:15:33 crc kubenswrapper[4937]: I0225 16:15:33.109565 4937 generic.go:334] "Generic (PLEG): container finished" podID="18c91384-797d-4ca1-8a29-f800994d26b7" containerID="08fb0f4b3814f3eb26b2a568a15dab936c66c22e0f142bc12dd7a55f5ce2f1d9" exitCode=0 Feb 25 16:15:33 crc kubenswrapper[4937]: I0225 16:15:33.109604 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7pb8b" event={"ID":"18c91384-797d-4ca1-8a29-f800994d26b7","Type":"ContainerDied","Data":"08fb0f4b3814f3eb26b2a568a15dab936c66c22e0f142bc12dd7a55f5ce2f1d9"} Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.129886 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c5876dc-2444-4491-8ab8-3360d3f4a84c","Type":"ContainerStarted","Data":"a41cbb9ec10ee5a3bf122dd49a0838b035020c14d46d84017bedcabe4b3dbd22"} Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.630167 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.652221 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-config-data\") pod \"18c91384-797d-4ca1-8a29-f800994d26b7\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.652302 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crkzk\" (UniqueName: \"kubernetes.io/projected/18c91384-797d-4ca1-8a29-f800994d26b7-kube-api-access-crkzk\") pod \"18c91384-797d-4ca1-8a29-f800994d26b7\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.652595 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-scripts\") pod \"18c91384-797d-4ca1-8a29-f800994d26b7\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.652669 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-combined-ca-bundle\") pod \"18c91384-797d-4ca1-8a29-f800994d26b7\" (UID: \"18c91384-797d-4ca1-8a29-f800994d26b7\") " Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.657914 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-scripts" (OuterVolumeSpecName: "scripts") pod "18c91384-797d-4ca1-8a29-f800994d26b7" (UID: "18c91384-797d-4ca1-8a29-f800994d26b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.676835 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c91384-797d-4ca1-8a29-f800994d26b7-kube-api-access-crkzk" (OuterVolumeSpecName: "kube-api-access-crkzk") pod "18c91384-797d-4ca1-8a29-f800994d26b7" (UID: "18c91384-797d-4ca1-8a29-f800994d26b7"). InnerVolumeSpecName "kube-api-access-crkzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.693313 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-config-data" (OuterVolumeSpecName: "config-data") pod "18c91384-797d-4ca1-8a29-f800994d26b7" (UID: "18c91384-797d-4ca1-8a29-f800994d26b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.708719 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18c91384-797d-4ca1-8a29-f800994d26b7" (UID: "18c91384-797d-4ca1-8a29-f800994d26b7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.755552 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.755585 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.755597 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c91384-797d-4ca1-8a29-f800994d26b7-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:34 crc kubenswrapper[4937]: I0225 16:15:34.755609 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crkzk\" (UniqueName: \"kubernetes.io/projected/18c91384-797d-4ca1-8a29-f800994d26b7-kube-api-access-crkzk\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:35 crc kubenswrapper[4937]: I0225 16:15:35.147770 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c5876dc-2444-4491-8ab8-3360d3f4a84c","Type":"ContainerStarted","Data":"6c74f45829c8dd89e12ee5508f75537b3a7afd7ce0829aa3b069a667403af318"} Feb 25 16:15:35 crc kubenswrapper[4937]: I0225 16:15:35.149844 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7pb8b" event={"ID":"18c91384-797d-4ca1-8a29-f800994d26b7","Type":"ContainerDied","Data":"03aa0c6f00a0642eea702175d0a7638f364dcbc8cdb10648aae33310f71398b7"} Feb 25 16:15:35 crc kubenswrapper[4937]: I0225 16:15:35.149883 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03aa0c6f00a0642eea702175d0a7638f364dcbc8cdb10648aae33310f71398b7" Feb 25 16:15:35 crc kubenswrapper[4937]: I0225 16:15:35.149948 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7pb8b" Feb 25 16:15:35 crc kubenswrapper[4937]: I0225 16:15:35.311348 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:15:35 crc kubenswrapper[4937]: I0225 16:15:35.311675 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5d227184-5e5f-4de7-b13a-1d38af727834" containerName="nova-api-log" containerID="cri-o://92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010" gracePeriod=30 Feb 25 16:15:35 crc kubenswrapper[4937]: I0225 16:15:35.311761 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5d227184-5e5f-4de7-b13a-1d38af727834" containerName="nova-api-api" containerID="cri-o://8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721" gracePeriod=30 Feb 25 16:15:35 crc kubenswrapper[4937]: I0225 16:15:35.339224 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:15:35 crc kubenswrapper[4937]: I0225 16:15:35.339707 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8408967e-27f5-424c-9a30-be0b1b30812b" containerName="nova-scheduler-scheduler" containerID="cri-o://0877eab7cd85879652c63807278dfdbd811256eab8030a4026359b084e89a79e" gracePeriod=30 Feb 25 16:15:35 crc kubenswrapper[4937]: I0225 16:15:35.363340 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:15:35 crc kubenswrapper[4937]: I0225 16:15:35.363631 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerName="nova-metadata-log" containerID="cri-o://6ca7c5615756355856ccb79658ed2df5e622cdfb86e80a8345524cb17206fcb1" gracePeriod=30 Feb 25 16:15:35 crc kubenswrapper[4937]: I0225 16:15:35.363960 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerName="nova-metadata-metadata" containerID="cri-o://4c5f94129e556d2f5cb898836f8defef125f08ff7b6c9b2bdba3438fc000d9eb" gracePeriod=30 Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.037444 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.116495 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qwfb\" (UniqueName: \"kubernetes.io/projected/5d227184-5e5f-4de7-b13a-1d38af727834-kube-api-access-8qwfb\") pod \"5d227184-5e5f-4de7-b13a-1d38af727834\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.116814 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-internal-tls-certs\") pod \"5d227184-5e5f-4de7-b13a-1d38af727834\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.116872 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-config-data\") pod \"5d227184-5e5f-4de7-b13a-1d38af727834\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.116973 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-public-tls-certs\") pod \"5d227184-5e5f-4de7-b13a-1d38af727834\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.117115 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-combined-ca-bundle\") pod \"5d227184-5e5f-4de7-b13a-1d38af727834\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.117247 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d227184-5e5f-4de7-b13a-1d38af727834-logs\") pod \"5d227184-5e5f-4de7-b13a-1d38af727834\" (UID: \"5d227184-5e5f-4de7-b13a-1d38af727834\") " Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.117711 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d227184-5e5f-4de7-b13a-1d38af727834-logs" (OuterVolumeSpecName: "logs") pod "5d227184-5e5f-4de7-b13a-1d38af727834" (UID: "5d227184-5e5f-4de7-b13a-1d38af727834"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.118283 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d227184-5e5f-4de7-b13a-1d38af727834-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.132043 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d227184-5e5f-4de7-b13a-1d38af727834-kube-api-access-8qwfb" (OuterVolumeSpecName: "kube-api-access-8qwfb") pod "5d227184-5e5f-4de7-b13a-1d38af727834" (UID: "5d227184-5e5f-4de7-b13a-1d38af727834"). InnerVolumeSpecName "kube-api-access-8qwfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.146228 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-config-data" (OuterVolumeSpecName: "config-data") pod "5d227184-5e5f-4de7-b13a-1d38af727834" (UID: "5d227184-5e5f-4de7-b13a-1d38af727834"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.165953 4937 generic.go:334] "Generic (PLEG): container finished" podID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerID="6ca7c5615756355856ccb79658ed2df5e622cdfb86e80a8345524cb17206fcb1" exitCode=143 Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.166012 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69898463-2d64-46cb-8d7a-ff187bb8b0a1","Type":"ContainerDied","Data":"6ca7c5615756355856ccb79658ed2df5e622cdfb86e80a8345524cb17206fcb1"} Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.168801 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d227184-5e5f-4de7-b13a-1d38af727834" (UID: "5d227184-5e5f-4de7-b13a-1d38af727834"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.169795 4937 generic.go:334] "Generic (PLEG): container finished" podID="5d227184-5e5f-4de7-b13a-1d38af727834" containerID="8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721" exitCode=0 Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.169815 4937 generic.go:334] "Generic (PLEG): container finished" podID="5d227184-5e5f-4de7-b13a-1d38af727834" containerID="92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010" exitCode=143 Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.169845 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d227184-5e5f-4de7-b13a-1d38af727834","Type":"ContainerDied","Data":"8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721"} Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.169862 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d227184-5e5f-4de7-b13a-1d38af727834","Type":"ContainerDied","Data":"92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010"} Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.169871 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d227184-5e5f-4de7-b13a-1d38af727834","Type":"ContainerDied","Data":"e71314ba221aed314943e06e65f6e6675f5008e2ff2df0d582bd03afc2ec4235"} Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.169886 4937 scope.go:117] "RemoveContainer" containerID="8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.170001 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.175400 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c5876dc-2444-4491-8ab8-3360d3f4a84c","Type":"ContainerStarted","Data":"9ad5a3fa9418ea7024edce8e14f568eb427b36aeb0f2ad9bc6c0b2e921a1646d"} Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.176583 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.193263 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5d227184-5e5f-4de7-b13a-1d38af727834" (UID: "5d227184-5e5f-4de7-b13a-1d38af727834"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.195587 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5d227184-5e5f-4de7-b13a-1d38af727834" (UID: "5d227184-5e5f-4de7-b13a-1d38af727834"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.201536 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.393800945 podStartE2EDuration="5.20151582s" podCreationTimestamp="2026-02-25 16:15:31 +0000 UTC" firstStartedPulling="2026-02-25 16:15:32.047611966 +0000 UTC m=+1783.061003856" lastFinishedPulling="2026-02-25 16:15:35.855326841 +0000 UTC m=+1786.868718731" observedRunningTime="2026-02-25 16:15:36.197795927 +0000 UTC m=+1787.211187837" watchObservedRunningTime="2026-02-25 16:15:36.20151582 +0000 UTC m=+1787.214907710" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.220691 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.220727 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qwfb\" (UniqueName: \"kubernetes.io/projected/5d227184-5e5f-4de7-b13a-1d38af727834-kube-api-access-8qwfb\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.220737 4937 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.220747 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.220759 4937 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d227184-5e5f-4de7-b13a-1d38af727834-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.334748 4937 scope.go:117] "RemoveContainer" containerID="92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010" Feb 25 
16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.357438 4937 scope.go:117] "RemoveContainer" containerID="8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721" Feb 25 16:15:36 crc kubenswrapper[4937]: E0225 16:15:36.357945 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721\": container with ID starting with 8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721 not found: ID does not exist" containerID="8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.357979 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721"} err="failed to get container status \"8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721\": rpc error: code = NotFound desc = could not find container \"8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721\": container with ID starting with 8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721 not found: ID does not exist" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.357999 4937 scope.go:117] "RemoveContainer" containerID="92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010" Feb 25 16:15:36 crc kubenswrapper[4937]: E0225 16:15:36.358467 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010\": container with ID starting with 92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010 not found: ID does not exist" containerID="92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.358534 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010"} err="failed to get container status \"92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010\": rpc error: code = NotFound desc = could not find container \"92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010\": container with ID starting with 92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010 not found: ID does not exist" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.358567 4937 scope.go:117] "RemoveContainer" containerID="8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.358915 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721"} err="failed to get container status \"8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721\": rpc error: code = NotFound desc = could not find container \"8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721\": container with ID starting with 8ad2d11b0ce9a71b2035f31b8fa85326cc53f7de0257e16e81aa329e02818721 not found: ID does not exist" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.358934 4937 scope.go:117] "RemoveContainer" containerID="92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.359155 4937 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010"} err="failed to get container status \"92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010\": rpc error: code = NotFound desc = could not find container \"92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010\": container with ID starting with 92a19430c54233d24c3f6447abbc1ebc2b1d9356d899ee5e607ce0b596f10010 not found: ID does not exist" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.516033 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.527780 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.546204 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 25 16:15:36 crc kubenswrapper[4937]: E0225 16:15:36.546713 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d227184-5e5f-4de7-b13a-1d38af727834" containerName="nova-api-log" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.546730 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d227184-5e5f-4de7-b13a-1d38af727834" containerName="nova-api-log" Feb 25 16:15:36 crc kubenswrapper[4937]: E0225 16:15:36.546747 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c91384-797d-4ca1-8a29-f800994d26b7" containerName="nova-manage" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.546754 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c91384-797d-4ca1-8a29-f800994d26b7" containerName="nova-manage" Feb 25 16:15:36 crc kubenswrapper[4937]: E0225 16:15:36.546783 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d227184-5e5f-4de7-b13a-1d38af727834" containerName="nova-api-api" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.546790 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d227184-5e5f-4de7-b13a-1d38af727834" containerName="nova-api-api" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.546986 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c91384-797d-4ca1-8a29-f800994d26b7" containerName="nova-manage" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.547004 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d227184-5e5f-4de7-b13a-1d38af727834" containerName="nova-api-log" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.547019 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d227184-5e5f-4de7-b13a-1d38af727834" containerName="nova-api-api" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.548184 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.550396 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.551260 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.553810 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.561458 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.628100 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.628161 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-logs\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.628202 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79xjf\" (UniqueName: \"kubernetes.io/projected/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-kube-api-access-79xjf\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.628242 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.628503 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-public-tls-certs\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.628722 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-config-data\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.730089 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.730179 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.730234 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-config-data\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.730296 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.730318 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-logs\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.730352 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79xjf\" (UniqueName: \"kubernetes.io/projected/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-kube-api-access-79xjf\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.730938 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-logs\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.735737 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-public-tls-certs\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.735954 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.736538 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.738719 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-config-data\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.758914 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79xjf\" (UniqueName: \"kubernetes.io/projected/e90a8e88-5fc8-48fe-af70-c6f6553d8b62-kube-api-access-79xjf\") pod \"nova-api-0\" (UID: \"e90a8e88-5fc8-48fe-af70-c6f6553d8b62\") " 
pod="openstack/nova-api-0" Feb 25 16:15:36 crc kubenswrapper[4937]: I0225 16:15:36.870062 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 16:15:37 crc kubenswrapper[4937]: I0225 16:15:37.382364 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d227184-5e5f-4de7-b13a-1d38af727834" path="/var/lib/kubelet/pods/5d227184-5e5f-4de7-b13a-1d38af727834/volumes" Feb 25 16:15:37 crc kubenswrapper[4937]: I0225 16:15:37.413780 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 16:15:37 crc kubenswrapper[4937]: I0225 16:15:37.703754 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 16:15:37 crc kubenswrapper[4937]: I0225 16:15:37.754847 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-combined-ca-bundle\") pod \"8408967e-27f5-424c-9a30-be0b1b30812b\" (UID: \"8408967e-27f5-424c-9a30-be0b1b30812b\") " Feb 25 16:15:37 crc kubenswrapper[4937]: I0225 16:15:37.755050 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69gpn\" (UniqueName: \"kubernetes.io/projected/8408967e-27f5-424c-9a30-be0b1b30812b-kube-api-access-69gpn\") pod \"8408967e-27f5-424c-9a30-be0b1b30812b\" (UID: \"8408967e-27f5-424c-9a30-be0b1b30812b\") " Feb 25 16:15:37 crc kubenswrapper[4937]: I0225 16:15:37.755082 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-config-data\") pod \"8408967e-27f5-424c-9a30-be0b1b30812b\" (UID: \"8408967e-27f5-424c-9a30-be0b1b30812b\") " Feb 25 16:15:37 crc kubenswrapper[4937]: I0225 16:15:37.759355 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8408967e-27f5-424c-9a30-be0b1b30812b-kube-api-access-69gpn" (OuterVolumeSpecName: "kube-api-access-69gpn") pod "8408967e-27f5-424c-9a30-be0b1b30812b" (UID: "8408967e-27f5-424c-9a30-be0b1b30812b"). InnerVolumeSpecName "kube-api-access-69gpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:37 crc kubenswrapper[4937]: E0225 16:15:37.795625 4937 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-config-data podName:8408967e-27f5-424c-9a30-be0b1b30812b nodeName:}" failed. No retries permitted until 2026-02-25 16:15:38.295592561 +0000 UTC m=+1789.308984461 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-config-data") pod "8408967e-27f5-424c-9a30-be0b1b30812b" (UID: "8408967e-27f5-424c-9a30-be0b1b30812b") : error deleting /var/lib/kubelet/pods/8408967e-27f5-424c-9a30-be0b1b30812b/volume-subpaths: remove /var/lib/kubelet/pods/8408967e-27f5-424c-9a30-be0b1b30812b/volume-subpaths: no such file or directory Feb 25 16:15:37 crc kubenswrapper[4937]: I0225 16:15:37.799378 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8408967e-27f5-424c-9a30-be0b1b30812b" (UID: "8408967e-27f5-424c-9a30-be0b1b30812b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:37 crc kubenswrapper[4937]: I0225 16:15:37.858225 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69gpn\" (UniqueName: \"kubernetes.io/projected/8408967e-27f5-424c-9a30-be0b1b30812b-kube-api-access-69gpn\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:37 crc kubenswrapper[4937]: I0225 16:15:37.858252 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.196404 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e90a8e88-5fc8-48fe-af70-c6f6553d8b62","Type":"ContainerStarted","Data":"52d07cad3a43322b1c37775192cfc78e3d51a7f52d28a57e3329e97e28c5dea2"} Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.196442 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e90a8e88-5fc8-48fe-af70-c6f6553d8b62","Type":"ContainerStarted","Data":"e7ca25d96cd3baa4d9ed558478d2f54f68d52652b8572d614150168daa4032ad"} Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.196454 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e90a8e88-5fc8-48fe-af70-c6f6553d8b62","Type":"ContainerStarted","Data":"e8ce3ae2a9e0eac305d2f53ed492fdb4990ee8e2a6bde7a96ccd703ab6e3aedf"} Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.197996 4937 generic.go:334] "Generic (PLEG): container finished" podID="8408967e-27f5-424c-9a30-be0b1b30812b" containerID="0877eab7cd85879652c63807278dfdbd811256eab8030a4026359b084e89a79e" exitCode=0 Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.198052 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8408967e-27f5-424c-9a30-be0b1b30812b","Type":"ContainerDied","Data":"0877eab7cd85879652c63807278dfdbd811256eab8030a4026359b084e89a79e"} Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.198097 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.198104 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8408967e-27f5-424c-9a30-be0b1b30812b","Type":"ContainerDied","Data":"d839301cbaab0a22faca8522bec57f66133ce7326e2a4a3ff9a6088fdc4cde3b"} Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.198117 4937 scope.go:117] "RemoveContainer" containerID="0877eab7cd85879652c63807278dfdbd811256eab8030a4026359b084e89a79e" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.226685 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.226671328 podStartE2EDuration="2.226671328s" podCreationTimestamp="2026-02-25 16:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:15:38.221942969 +0000 UTC m=+1789.235334859" watchObservedRunningTime="2026-02-25 16:15:38.226671328 +0000 UTC m=+1789.240063218" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.237735 4937 scope.go:117] "RemoveContainer" containerID="0877eab7cd85879652c63807278dfdbd811256eab8030a4026359b084e89a79e" Feb 25 16:15:38 crc kubenswrapper[4937]: E0225 16:15:38.238237 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0877eab7cd85879652c63807278dfdbd811256eab8030a4026359b084e89a79e\": container with ID starting with 0877eab7cd85879652c63807278dfdbd811256eab8030a4026359b084e89a79e not found: ID does not exist" containerID="0877eab7cd85879652c63807278dfdbd811256eab8030a4026359b084e89a79e" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.238278 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0877eab7cd85879652c63807278dfdbd811256eab8030a4026359b084e89a79e"} err="failed to get container status \"0877eab7cd85879652c63807278dfdbd811256eab8030a4026359b084e89a79e\": rpc error: code = NotFound desc = could not find container \"0877eab7cd85879652c63807278dfdbd811256eab8030a4026359b084e89a79e\": container with ID starting with 0877eab7cd85879652c63807278dfdbd811256eab8030a4026359b084e89a79e not found: ID does not exist" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.375067 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-config-data\") pod \"8408967e-27f5-424c-9a30-be0b1b30812b\" (UID: \"8408967e-27f5-424c-9a30-be0b1b30812b\") " Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.380535 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-config-data" (OuterVolumeSpecName: "config-data") pod "8408967e-27f5-424c-9a30-be0b1b30812b" (UID: "8408967e-27f5-424c-9a30-be0b1b30812b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.477975 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8408967e-27f5-424c-9a30-be0b1b30812b-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.507261 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": read tcp 10.217.0.2:53430->10.217.0.229:8775: read: connection reset by peer" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.507285 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": read tcp 10.217.0.2:53438->10.217.0.229:8775: read: connection reset by peer" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.664159 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.685449 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.696479 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:15:38 crc kubenswrapper[4937]: E0225 16:15:38.697020 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8408967e-27f5-424c-9a30-be0b1b30812b" containerName="nova-scheduler-scheduler" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.697034 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8408967e-27f5-424c-9a30-be0b1b30812b" containerName="nova-scheduler-scheduler" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.697248 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="8408967e-27f5-424c-9a30-be0b1b30812b" containerName="nova-scheduler-scheduler" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.698123 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.703867 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.705568 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.785194 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqf5n\" (UniqueName: \"kubernetes.io/projected/987df1b8-49a2-4ec2-b92d-64619d55b516-kube-api-access-qqf5n\") pod \"nova-scheduler-0\" (UID: \"987df1b8-49a2-4ec2-b92d-64619d55b516\") " pod="openstack/nova-scheduler-0" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.788960 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987df1b8-49a2-4ec2-b92d-64619d55b516-config-data\") pod \"nova-scheduler-0\" (UID: \"987df1b8-49a2-4ec2-b92d-64619d55b516\") " pod="openstack/nova-scheduler-0" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.789080 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987df1b8-49a2-4ec2-b92d-64619d55b516-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"987df1b8-49a2-4ec2-b92d-64619d55b516\") " pod="openstack/nova-scheduler-0" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.891272 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987df1b8-49a2-4ec2-b92d-64619d55b516-config-data\") pod \"nova-scheduler-0\" (UID: \"987df1b8-49a2-4ec2-b92d-64619d55b516\") " pod="openstack/nova-scheduler-0" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.891368 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987df1b8-49a2-4ec2-b92d-64619d55b516-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"987df1b8-49a2-4ec2-b92d-64619d55b516\") " pod="openstack/nova-scheduler-0" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.891419 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqf5n\" (UniqueName: \"kubernetes.io/projected/987df1b8-49a2-4ec2-b92d-64619d55b516-kube-api-access-qqf5n\") pod \"nova-scheduler-0\" (UID: \"987df1b8-49a2-4ec2-b92d-64619d55b516\") " pod="openstack/nova-scheduler-0" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.896464 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987df1b8-49a2-4ec2-b92d-64619d55b516-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"987df1b8-49a2-4ec2-b92d-64619d55b516\") " pod="openstack/nova-scheduler-0" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.909040 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987df1b8-49a2-4ec2-b92d-64619d55b516-config-data\") pod \"nova-scheduler-0\" (UID: \"987df1b8-49a2-4ec2-b92d-64619d55b516\") " pod="openstack/nova-scheduler-0" Feb 25 16:15:38 crc kubenswrapper[4937]: I0225 16:15:38.927971 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqf5n\" (UniqueName: 
\"kubernetes.io/projected/987df1b8-49a2-4ec2-b92d-64619d55b516-kube-api-access-qqf5n\") pod \"nova-scheduler-0\" (UID: \"987df1b8-49a2-4ec2-b92d-64619d55b516\") " pod="openstack/nova-scheduler-0" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.083932 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.215881 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.218082 4937 generic.go:334] "Generic (PLEG): container finished" podID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerID="4c5f94129e556d2f5cb898836f8defef125f08ff7b6c9b2bdba3438fc000d9eb" exitCode=0 Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.218335 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69898463-2d64-46cb-8d7a-ff187bb8b0a1","Type":"ContainerDied","Data":"4c5f94129e556d2f5cb898836f8defef125f08ff7b6c9b2bdba3438fc000d9eb"} Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.219110 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69898463-2d64-46cb-8d7a-ff187bb8b0a1","Type":"ContainerDied","Data":"bef984373fc8c782dab8ea82cc314d06825e0005473442a64da5691278e51ffc"} Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.219140 4937 scope.go:117] "RemoveContainer" containerID="4c5f94129e556d2f5cb898836f8defef125f08ff7b6c9b2bdba3438fc000d9eb" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.268823 4937 scope.go:117] "RemoveContainer" containerID="6ca7c5615756355856ccb79658ed2df5e622cdfb86e80a8345524cb17206fcb1" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.299540 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5tc7\" (UniqueName: \"kubernetes.io/projected/69898463-2d64-46cb-8d7a-ff187bb8b0a1-kube-api-access-m5tc7\") pod \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.299660 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-config-data\") pod \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.299762 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69898463-2d64-46cb-8d7a-ff187bb8b0a1-logs\") pod \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.299958 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-nova-metadata-tls-certs\") pod \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.299999 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-combined-ca-bundle\") pod \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\" (UID: \"69898463-2d64-46cb-8d7a-ff187bb8b0a1\") " Feb 25 16:15:39 crc 
kubenswrapper[4937]: I0225 16:15:39.301590 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69898463-2d64-46cb-8d7a-ff187bb8b0a1-logs" (OuterVolumeSpecName: "logs") pod "69898463-2d64-46cb-8d7a-ff187bb8b0a1" (UID: "69898463-2d64-46cb-8d7a-ff187bb8b0a1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.313545 4937 scope.go:117] "RemoveContainer" containerID="4c5f94129e556d2f5cb898836f8defef125f08ff7b6c9b2bdba3438fc000d9eb" Feb 25 16:15:39 crc kubenswrapper[4937]: E0225 16:15:39.316205 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c5f94129e556d2f5cb898836f8defef125f08ff7b6c9b2bdba3438fc000d9eb\": container with ID starting with 4c5f94129e556d2f5cb898836f8defef125f08ff7b6c9b2bdba3438fc000d9eb not found: ID does not exist" containerID="4c5f94129e556d2f5cb898836f8defef125f08ff7b6c9b2bdba3438fc000d9eb" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.316236 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c5f94129e556d2f5cb898836f8defef125f08ff7b6c9b2bdba3438fc000d9eb"} err="failed to get container status \"4c5f94129e556d2f5cb898836f8defef125f08ff7b6c9b2bdba3438fc000d9eb\": rpc error: code = NotFound desc = could not find container \"4c5f94129e556d2f5cb898836f8defef125f08ff7b6c9b2bdba3438fc000d9eb\": container with ID starting with 4c5f94129e556d2f5cb898836f8defef125f08ff7b6c9b2bdba3438fc000d9eb not found: ID does not exist" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.316257 4937 scope.go:117] "RemoveContainer" containerID="6ca7c5615756355856ccb79658ed2df5e622cdfb86e80a8345524cb17206fcb1" Feb 25 16:15:39 crc kubenswrapper[4937]: E0225 16:15:39.317323 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ca7c5615756355856ccb79658ed2df5e622cdfb86e80a8345524cb17206fcb1\": container with ID starting with 6ca7c5615756355856ccb79658ed2df5e622cdfb86e80a8345524cb17206fcb1 not found: ID does not exist" containerID="6ca7c5615756355856ccb79658ed2df5e622cdfb86e80a8345524cb17206fcb1" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.317347 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ca7c5615756355856ccb79658ed2df5e622cdfb86e80a8345524cb17206fcb1"} err="failed to get container status \"6ca7c5615756355856ccb79658ed2df5e622cdfb86e80a8345524cb17206fcb1\": rpc error: code = NotFound desc = could not find container \"6ca7c5615756355856ccb79658ed2df5e622cdfb86e80a8345524cb17206fcb1\": container with ID starting with 6ca7c5615756355856ccb79658ed2df5e622cdfb86e80a8345524cb17206fcb1 not found: ID does not exist" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.317957 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69898463-2d64-46cb-8d7a-ff187bb8b0a1-kube-api-access-m5tc7" (OuterVolumeSpecName: "kube-api-access-m5tc7") pod "69898463-2d64-46cb-8d7a-ff187bb8b0a1" (UID: "69898463-2d64-46cb-8d7a-ff187bb8b0a1"). InnerVolumeSpecName "kube-api-access-m5tc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.354995 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-config-data" (OuterVolumeSpecName: "config-data") pod "69898463-2d64-46cb-8d7a-ff187bb8b0a1" (UID: "69898463-2d64-46cb-8d7a-ff187bb8b0a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.377030 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69898463-2d64-46cb-8d7a-ff187bb8b0a1" (UID: "69898463-2d64-46cb-8d7a-ff187bb8b0a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.393347 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8408967e-27f5-424c-9a30-be0b1b30812b" path="/var/lib/kubelet/pods/8408967e-27f5-424c-9a30-be0b1b30812b/volumes" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.403729 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69898463-2d64-46cb-8d7a-ff187bb8b0a1-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.403768 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.403778 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5tc7\" (UniqueName: \"kubernetes.io/projected/69898463-2d64-46cb-8d7a-ff187bb8b0a1-kube-api-access-m5tc7\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.403786 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.419683 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "69898463-2d64-46cb-8d7a-ff187bb8b0a1" (UID: "69898463-2d64-46cb-8d7a-ff187bb8b0a1"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.506208 4937 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69898463-2d64-46cb-8d7a-ff187bb8b0a1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:15:39 crc kubenswrapper[4937]: W0225 16:15:39.628432 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod987df1b8_49a2_4ec2_b92d_64619d55b516.slice/crio-0dc9aa092fee6a37dd4a9f5b40e878a0b8cb64a377f1b8333a0f569327581620 WatchSource:0}: Error finding container 0dc9aa092fee6a37dd4a9f5b40e878a0b8cb64a377f1b8333a0f569327581620: Status 404 returned error can't find the container with id 0dc9aa092fee6a37dd4a9f5b40e878a0b8cb64a377f1b8333a0f569327581620 Feb 25 16:15:39 crc kubenswrapper[4937]: I0225 16:15:39.637799 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.235382 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"987df1b8-49a2-4ec2-b92d-64619d55b516","Type":"ContainerStarted","Data":"fd2e7db15b9053e688cdba404c642f5f3faeb714474e17b9aa7629de69cf88c3"} Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.235423 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"987df1b8-49a2-4ec2-b92d-64619d55b516","Type":"ContainerStarted","Data":"0dc9aa092fee6a37dd4a9f5b40e878a0b8cb64a377f1b8333a0f569327581620"} Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.240341 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.287446 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.2874252 podStartE2EDuration="2.2874252s" podCreationTimestamp="2026-02-25 16:15:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:15:40.262571596 +0000 UTC m=+1791.275963496" watchObservedRunningTime="2026-02-25 16:15:40.2874252 +0000 UTC m=+1791.300817090" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.311761 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.336844 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.347232 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:15:40 crc kubenswrapper[4937]: E0225 16:15:40.347917 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerName="nova-metadata-log" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.347947 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerName="nova-metadata-log" Feb 25 16:15:40 crc kubenswrapper[4937]: E0225 16:15:40.347989 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerName="nova-metadata-metadata" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.348000 4937 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerName="nova-metadata-metadata" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.348358 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerName="nova-metadata-metadata" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.348418 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" containerName="nova-metadata-log" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.350144 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.353040 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.353324 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.367895 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.425595 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr7d2\" (UniqueName: \"kubernetes.io/projected/9d7981f5-bfaf-41a0-a577-ab25b40dc375-kube-api-access-nr7d2\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.425672 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7981f5-bfaf-41a0-a577-ab25b40dc375-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.425702 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7981f5-bfaf-41a0-a577-ab25b40dc375-config-data\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.426085 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d7981f5-bfaf-41a0-a577-ab25b40dc375-logs\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.426242 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7981f5-bfaf-41a0-a577-ab25b40dc375-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.528605 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr7d2\" (UniqueName: \"kubernetes.io/projected/9d7981f5-bfaf-41a0-a577-ab25b40dc375-kube-api-access-nr7d2\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.528662 4937 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7981f5-bfaf-41a0-a577-ab25b40dc375-config-data\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.528680 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7981f5-bfaf-41a0-a577-ab25b40dc375-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.528721 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d7981f5-bfaf-41a0-a577-ab25b40dc375-logs\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.528785 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7981f5-bfaf-41a0-a577-ab25b40dc375-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.529207 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d7981f5-bfaf-41a0-a577-ab25b40dc375-logs\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.542141 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7981f5-bfaf-41a0-a577-ab25b40dc375-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.542811 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7981f5-bfaf-41a0-a577-ab25b40dc375-config-data\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.543940 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7981f5-bfaf-41a0-a577-ab25b40dc375-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.548169 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr7d2\" (UniqueName: \"kubernetes.io/projected/9d7981f5-bfaf-41a0-a577-ab25b40dc375-kube-api-access-nr7d2\") pod \"nova-metadata-0\" (UID: \"9d7981f5-bfaf-41a0-a577-ab25b40dc375\") " pod="openstack/nova-metadata-0" Feb 25 16:15:40 crc kubenswrapper[4937]: I0225 16:15:40.679017 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 16:15:41 crc kubenswrapper[4937]: I0225 16:15:41.175406 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 16:15:41 crc kubenswrapper[4937]: I0225 16:15:41.261632 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d7981f5-bfaf-41a0-a577-ab25b40dc375","Type":"ContainerStarted","Data":"991cad49024636ad9410d53e05c2a8fe363a29b4ee83addbc5ee47817bb8b317"} Feb 25 16:15:41 crc kubenswrapper[4937]: I0225 16:15:41.385124 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69898463-2d64-46cb-8d7a-ff187bb8b0a1" path="/var/lib/kubelet/pods/69898463-2d64-46cb-8d7a-ff187bb8b0a1/volumes" Feb 25 16:15:42 crc kubenswrapper[4937]: I0225 16:15:42.280523 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d7981f5-bfaf-41a0-a577-ab25b40dc375","Type":"ContainerStarted","Data":"ee4ca7537af456aa3f88e807faef4eb491563cb69302df891e964158fe68ac46"} Feb 25 16:15:42 crc kubenswrapper[4937]: I0225 16:15:42.280863 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d7981f5-bfaf-41a0-a577-ab25b40dc375","Type":"ContainerStarted","Data":"4dc5e6323f0983e5224a5cc6e9dd22cf2600320dac11c7db2365e6eff8a47680"} Feb 25 16:15:42 crc kubenswrapper[4937]: I0225 16:15:42.322115 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.322085866 podStartE2EDuration="2.322085866s" podCreationTimestamp="2026-02-25 16:15:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:15:42.314309131 +0000 UTC m=+1793.327701031" watchObservedRunningTime="2026-02-25 16:15:42.322085866 +0000 UTC m=+1793.335477776" Feb 25 16:15:43 crc kubenswrapper[4937]: I0225 16:15:43.368116 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:15:43 crc kubenswrapper[4937]: E0225 16:15:43.369118 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:15:44 crc kubenswrapper[4937]: I0225 16:15:44.085065 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 25 16:15:45 crc kubenswrapper[4937]: I0225 16:15:45.680153 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 16:15:45 crc kubenswrapper[4937]: I0225 16:15:45.680591 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 16:15:46 crc kubenswrapper[4937]: I0225 16:15:46.871158 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 16:15:46 crc kubenswrapper[4937]: I0225 16:15:46.871215 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 16:15:47 crc kubenswrapper[4937]: I0225 16:15:47.885656 4937 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="e90a8e88-5fc8-48fe-af70-c6f6553d8b62" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.243:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 25 16:15:47 crc kubenswrapper[4937]: I0225 16:15:47.885726 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e90a8e88-5fc8-48fe-af70-c6f6553d8b62" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.243:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 16:15:49 crc kubenswrapper[4937]: I0225 16:15:49.085189 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 25 16:15:49 crc kubenswrapper[4937]: I0225 16:15:49.136817 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 25 16:15:49 crc kubenswrapper[4937]: I0225 16:15:49.391565 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 25 16:15:50 crc kubenswrapper[4937]: I0225 16:15:50.679461 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 25 16:15:50 crc kubenswrapper[4937]: I0225 16:15:50.679820 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 25 16:15:51 crc kubenswrapper[4937]: I0225 16:15:51.694873 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9d7981f5-bfaf-41a0-a577-ab25b40dc375" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.245:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 25 16:15:51 crc kubenswrapper[4937]: I0225 16:15:51.694928 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9d7981f5-bfaf-41a0-a577-ab25b40dc375" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.245:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 25 16:15:53 crc kubenswrapper[4937]: I0225 16:15:53.207982 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r5xz8"] Feb 25 16:15:53 crc kubenswrapper[4937]: I0225 16:15:53.211155 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:15:53 crc kubenswrapper[4937]: I0225 16:15:53.217772 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r5xz8"] Feb 25 16:15:53 crc kubenswrapper[4937]: I0225 16:15:53.318567 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clqxf\" (UniqueName: \"kubernetes.io/projected/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-kube-api-access-clqxf\") pod \"certified-operators-r5xz8\" (UID: \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\") " pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:15:53 crc kubenswrapper[4937]: I0225 16:15:53.318631 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-catalog-content\") pod \"certified-operators-r5xz8\" (UID: \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\") " pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:15:53 crc kubenswrapper[4937]: I0225 16:15:53.318739 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-utilities\") pod \"certified-operators-r5xz8\" (UID: \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\") " pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:15:53 crc kubenswrapper[4937]: I0225 16:15:53.420890 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clqxf\" (UniqueName: \"kubernetes.io/projected/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-kube-api-access-clqxf\") pod \"certified-operators-r5xz8\" (UID: \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\") " pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:15:53 crc kubenswrapper[4937]: I0225 16:15:53.420976 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-catalog-content\") pod \"certified-operators-r5xz8\" (UID: \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\") " pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:15:53 crc kubenswrapper[4937]: I0225 16:15:53.421181 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-utilities\") pod \"certified-operators-r5xz8\" (UID: \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\") " pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:15:53 crc kubenswrapper[4937]: I0225 16:15:53.421928 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-catalog-content\") pod \"certified-operators-r5xz8\" (UID: \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\") " pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:15:53 crc kubenswrapper[4937]: I0225 16:15:53.422514 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-utilities\") pod \"certified-operators-r5xz8\" (UID: \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\") " pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:15:53 crc kubenswrapper[4937]: I0225 16:15:53.450995 4937 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-clqxf\" (UniqueName: \"kubernetes.io/projected/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-kube-api-access-clqxf\") pod \"certified-operators-r5xz8\" (UID: \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\") " pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:15:53 crc kubenswrapper[4937]: I0225 16:15:53.548957 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:15:54 crc kubenswrapper[4937]: I0225 16:15:54.115504 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r5xz8"] Feb 25 16:15:54 crc kubenswrapper[4937]: I0225 16:15:54.432132 4937 generic.go:334] "Generic (PLEG): container finished" podID="e5b80aa7-9a93-4c10-84a8-d6d12889d28e" containerID="1cdbbf4008de41aaa4a6ebff22164e5cd940f1a39a6abd791a56173ca61178da" exitCode=0 Feb 25 16:15:54 crc kubenswrapper[4937]: I0225 16:15:54.432182 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5xz8" event={"ID":"e5b80aa7-9a93-4c10-84a8-d6d12889d28e","Type":"ContainerDied","Data":"1cdbbf4008de41aaa4a6ebff22164e5cd940f1a39a6abd791a56173ca61178da"} Feb 25 16:15:54 crc kubenswrapper[4937]: I0225 16:15:54.432211 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5xz8" event={"ID":"e5b80aa7-9a93-4c10-84a8-d6d12889d28e","Type":"ContainerStarted","Data":"3cc4afaca4045f4240fe26acbcb04afa8cc7f5230a4b484afa08c0c3719dafb7"} Feb 25 16:15:56 crc kubenswrapper[4937]: I0225 16:15:56.878191 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 25 16:15:56 crc kubenswrapper[4937]: I0225 16:15:56.879079 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 25 16:15:56 crc kubenswrapper[4937]: I0225 16:15:56.879645 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 25 16:15:56 crc kubenswrapper[4937]: I0225 16:15:56.880119 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 25 16:15:56 crc kubenswrapper[4937]: I0225 16:15:56.890733 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 25 16:15:56 crc kubenswrapper[4937]: I0225 16:15:56.894702 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 25 16:15:57 crc kubenswrapper[4937]: I0225 16:15:57.369031 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:15:57 crc kubenswrapper[4937]: E0225 16:15:57.369390 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.197463 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533936-7gx2j"] Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.201050 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533936-7gx2j" Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.203736 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.203951 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.204405 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.271123 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533936-7gx2j"] Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.293435 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lk9b\" (UniqueName: \"kubernetes.io/projected/3e3f7da7-06ca-4408-a2ff-c890385edcf0-kube-api-access-6lk9b\") pod \"auto-csr-approver-29533936-7gx2j\" (UID: \"3e3f7da7-06ca-4408-a2ff-c890385edcf0\") " pod="openshift-infra/auto-csr-approver-29533936-7gx2j" Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.396152 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lk9b\" (UniqueName: \"kubernetes.io/projected/3e3f7da7-06ca-4408-a2ff-c890385edcf0-kube-api-access-6lk9b\") pod \"auto-csr-approver-29533936-7gx2j\" (UID: \"3e3f7da7-06ca-4408-a2ff-c890385edcf0\") " pod="openshift-infra/auto-csr-approver-29533936-7gx2j" Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.425897 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lk9b\" (UniqueName: \"kubernetes.io/projected/3e3f7da7-06ca-4408-a2ff-c890385edcf0-kube-api-access-6lk9b\") pod \"auto-csr-approver-29533936-7gx2j\" (UID: \"3e3f7da7-06ca-4408-a2ff-c890385edcf0\") " pod="openshift-infra/auto-csr-approver-29533936-7gx2j" Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.511914 4937 generic.go:334] "Generic (PLEG): container finished" podID="e5b80aa7-9a93-4c10-84a8-d6d12889d28e" containerID="df2fe4ef78694f38b225e031c16c88df880fe1419da34db3a6936946f536ec09" exitCode=0 Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.511972 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5xz8" event={"ID":"e5b80aa7-9a93-4c10-84a8-d6d12889d28e","Type":"ContainerDied","Data":"df2fe4ef78694f38b225e031c16c88df880fe1419da34db3a6936946f536ec09"} Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.640578 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533936-7gx2j" Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.686652 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.687309 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 25 16:16:00 crc kubenswrapper[4937]: I0225 16:16:00.694957 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 25 16:16:01 crc kubenswrapper[4937]: I0225 16:16:01.185452 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533936-7gx2j"] Feb 25 16:16:01 crc kubenswrapper[4937]: W0225 16:16:01.235268 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e3f7da7_06ca_4408_a2ff_c890385edcf0.slice/crio-d1c95afc2f414966efa0b475729800b8cea2a2c07271c4a79738cfe159a8655f WatchSource:0}: Error finding container d1c95afc2f414966efa0b475729800b8cea2a2c07271c4a79738cfe159a8655f: Status 404 returned error can't find the container with id d1c95afc2f414966efa0b475729800b8cea2a2c07271c4a79738cfe159a8655f Feb 25 16:16:01 crc kubenswrapper[4937]: I0225 16:16:01.506400 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 25 16:16:01 crc kubenswrapper[4937]: I0225 16:16:01.525835 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533936-7gx2j" event={"ID":"3e3f7da7-06ca-4408-a2ff-c890385edcf0","Type":"ContainerStarted","Data":"d1c95afc2f414966efa0b475729800b8cea2a2c07271c4a79738cfe159a8655f"} Feb 25 16:16:01 crc kubenswrapper[4937]: I0225 16:16:01.554353 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 25 16:16:02 crc kubenswrapper[4937]: I0225 16:16:02.541615 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5xz8" event={"ID":"e5b80aa7-9a93-4c10-84a8-d6d12889d28e","Type":"ContainerStarted","Data":"c914b2f1783e89c419934bb732fb23fd3941ba6cd1066ed4169df2eb8aa2122d"} Feb 25 16:16:02 crc kubenswrapper[4937]: I0225 16:16:02.578613 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r5xz8" podStartSLOduration=2.654696743 podStartE2EDuration="9.578589868s" podCreationTimestamp="2026-02-25 16:15:53 +0000 UTC" firstStartedPulling="2026-02-25 16:15:54.434424063 +0000 UTC m=+1805.447815953" lastFinishedPulling="2026-02-25 16:16:01.358317148 +0000 UTC m=+1812.371709078" observedRunningTime="2026-02-25 16:16:02.566117046 +0000 UTC m=+1813.579508966" watchObservedRunningTime="2026-02-25 16:16:02.578589868 +0000 UTC m=+1813.591981758" Feb 25 16:16:03 crc kubenswrapper[4937]: I0225 16:16:03.549577 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:16:03 crc kubenswrapper[4937]: I0225 16:16:03.549614 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:16:03 crc kubenswrapper[4937]: I0225 16:16:03.553519 4937 generic.go:334] "Generic (PLEG): container finished" podID="3e3f7da7-06ca-4408-a2ff-c890385edcf0" containerID="c11b6b2b3762e7a7abc350a999b42beb55931df1a2229a4f9d7409f744605c7c" 
exitCode=0 Feb 25 16:16:03 crc kubenswrapper[4937]: I0225 16:16:03.553728 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533936-7gx2j" event={"ID":"3e3f7da7-06ca-4408-a2ff-c890385edcf0","Type":"ContainerDied","Data":"c11b6b2b3762e7a7abc350a999b42beb55931df1a2229a4f9d7409f744605c7c"} Feb 25 16:16:04 crc kubenswrapper[4937]: I0225 16:16:04.602451 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-r5xz8" podUID="e5b80aa7-9a93-4c10-84a8-d6d12889d28e" containerName="registry-server" probeResult="failure" output=< Feb 25 16:16:04 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 16:16:04 crc kubenswrapper[4937]: > Feb 25 16:16:05 crc kubenswrapper[4937]: I0225 16:16:05.020498 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533936-7gx2j" Feb 25 16:16:05 crc kubenswrapper[4937]: I0225 16:16:05.124646 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lk9b\" (UniqueName: \"kubernetes.io/projected/3e3f7da7-06ca-4408-a2ff-c890385edcf0-kube-api-access-6lk9b\") pod \"3e3f7da7-06ca-4408-a2ff-c890385edcf0\" (UID: \"3e3f7da7-06ca-4408-a2ff-c890385edcf0\") " Feb 25 16:16:05 crc kubenswrapper[4937]: I0225 16:16:05.131686 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3f7da7-06ca-4408-a2ff-c890385edcf0-kube-api-access-6lk9b" (OuterVolumeSpecName: "kube-api-access-6lk9b") pod "3e3f7da7-06ca-4408-a2ff-c890385edcf0" (UID: "3e3f7da7-06ca-4408-a2ff-c890385edcf0"). InnerVolumeSpecName "kube-api-access-6lk9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:05 crc kubenswrapper[4937]: I0225 16:16:05.227969 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lk9b\" (UniqueName: \"kubernetes.io/projected/3e3f7da7-06ca-4408-a2ff-c890385edcf0-kube-api-access-6lk9b\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:05 crc kubenswrapper[4937]: I0225 16:16:05.579108 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533936-7gx2j" event={"ID":"3e3f7da7-06ca-4408-a2ff-c890385edcf0","Type":"ContainerDied","Data":"d1c95afc2f414966efa0b475729800b8cea2a2c07271c4a79738cfe159a8655f"} Feb 25 16:16:05 crc kubenswrapper[4937]: I0225 16:16:05.579143 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1c95afc2f414966efa0b475729800b8cea2a2c07271c4a79738cfe159a8655f" Feb 25 16:16:05 crc kubenswrapper[4937]: I0225 16:16:05.579193 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533936-7gx2j" Feb 25 16:16:06 crc kubenswrapper[4937]: I0225 16:16:06.099462 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533930-xjhtn"] Feb 25 16:16:06 crc kubenswrapper[4937]: I0225 16:16:06.109503 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533930-xjhtn"] Feb 25 16:16:07 crc kubenswrapper[4937]: I0225 16:16:07.397891 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8" path="/var/lib/kubelet/pods/005a2e70-fc68-4d15-b3c6-61dcd4f4c1a8/volumes" Feb 25 16:16:08 crc kubenswrapper[4937]: I0225 16:16:08.367641 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:16:08 crc kubenswrapper[4937]: E0225 16:16:08.368171 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:16:13 crc kubenswrapper[4937]: I0225 16:16:13.610313 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:16:13 crc kubenswrapper[4937]: I0225 16:16:13.677922 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:16:13 crc kubenswrapper[4937]: I0225 16:16:13.758410 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r5xz8"] Feb 25 16:16:13 crc kubenswrapper[4937]: I0225 16:16:13.856578 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hmbgl"] Feb 25 16:16:13 crc kubenswrapper[4937]: I0225 16:16:13.856876 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hmbgl" podUID="e9f770fc-fde6-4340-8c2a-a33e619cb169" containerName="registry-server" containerID="cri-o://7f39be550bf2ea50e27e61e024e732b6a9902f5a1cdbeae7babb598137832834" gracePeriod=2 Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.369782 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-b4zhr"] Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.381933 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-b4zhr"] Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.467938 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-6zhwd"] Feb 25 16:16:14 crc kubenswrapper[4937]: E0225 16:16:14.468382 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3f7da7-06ca-4408-a2ff-c890385edcf0" containerName="oc" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.468405 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3f7da7-06ca-4408-a2ff-c890385edcf0" containerName="oc" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.468637 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3f7da7-06ca-4408-a2ff-c890385edcf0" containerName="oc" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.469353 4937 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.471104 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.484783 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-6zhwd"] Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.572211 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-combined-ca-bundle\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.572310 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b8520984-c620-42eb-ae9d-54a40aa32b55-certs\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.572341 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xvv9\" (UniqueName: \"kubernetes.io/projected/b8520984-c620-42eb-ae9d-54a40aa32b55-kube-api-access-7xvv9\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.572405 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-config-data\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.572427 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-scripts\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.683792 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-combined-ca-bundle\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.686267 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b8520984-c620-42eb-ae9d-54a40aa32b55-certs\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.686337 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xvv9\" (UniqueName: \"kubernetes.io/projected/b8520984-c620-42eb-ae9d-54a40aa32b55-kube-api-access-7xvv9\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 
16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.686527 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-config-data\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.686580 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-scripts\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.709630 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b8520984-c620-42eb-ae9d-54a40aa32b55-certs\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.709993 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-scripts\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.712464 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-config-data\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.717747 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xvv9\" (UniqueName: \"kubernetes.io/projected/b8520984-c620-42eb-ae9d-54a40aa32b55-kube-api-access-7xvv9\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.722951 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-combined-ca-bundle\") pod \"cloudkitty-db-sync-6zhwd\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.742713 4937 generic.go:334] "Generic (PLEG): container finished" podID="e9f770fc-fde6-4340-8c2a-a33e619cb169" containerID="7f39be550bf2ea50e27e61e024e732b6a9902f5a1cdbeae7babb598137832834" exitCode=0 Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.743159 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmbgl" event={"ID":"e9f770fc-fde6-4340-8c2a-a33e619cb169","Type":"ContainerDied","Data":"7f39be550bf2ea50e27e61e024e732b6a9902f5a1cdbeae7babb598137832834"} Feb 25 16:16:14 crc kubenswrapper[4937]: I0225 16:16:14.796225 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.042527 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.218146 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s962d\" (UniqueName: \"kubernetes.io/projected/e9f770fc-fde6-4340-8c2a-a33e619cb169-kube-api-access-s962d\") pod \"e9f770fc-fde6-4340-8c2a-a33e619cb169\" (UID: \"e9f770fc-fde6-4340-8c2a-a33e619cb169\") " Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.218260 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f770fc-fde6-4340-8c2a-a33e619cb169-utilities\") pod \"e9f770fc-fde6-4340-8c2a-a33e619cb169\" (UID: \"e9f770fc-fde6-4340-8c2a-a33e619cb169\") " Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.218582 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f770fc-fde6-4340-8c2a-a33e619cb169-catalog-content\") pod \"e9f770fc-fde6-4340-8c2a-a33e619cb169\" (UID: \"e9f770fc-fde6-4340-8c2a-a33e619cb169\") " Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.222237 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9f770fc-fde6-4340-8c2a-a33e619cb169-utilities" (OuterVolumeSpecName: "utilities") pod "e9f770fc-fde6-4340-8c2a-a33e619cb169" (UID: "e9f770fc-fde6-4340-8c2a-a33e619cb169"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.252203 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f770fc-fde6-4340-8c2a-a33e619cb169-kube-api-access-s962d" (OuterVolumeSpecName: "kube-api-access-s962d") pod "e9f770fc-fde6-4340-8c2a-a33e619cb169" (UID: "e9f770fc-fde6-4340-8c2a-a33e619cb169"). InnerVolumeSpecName "kube-api-access-s962d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.320611 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s962d\" (UniqueName: \"kubernetes.io/projected/e9f770fc-fde6-4340-8c2a-a33e619cb169-kube-api-access-s962d\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.320643 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f770fc-fde6-4340-8c2a-a33e619cb169-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.331884 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9f770fc-fde6-4340-8c2a-a33e619cb169-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9f770fc-fde6-4340-8c2a-a33e619cb169" (UID: "e9f770fc-fde6-4340-8c2a-a33e619cb169"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.389901 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44849697-9b41-4439-b8c7-f497036543aa" path="/var/lib/kubelet/pods/44849697-9b41-4439-b8c7-f497036543aa/volumes" Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.392974 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-6zhwd"] Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.422801 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f770fc-fde6-4340-8c2a-a33e619cb169-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.755719 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hmbgl" Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.755705 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmbgl" event={"ID":"e9f770fc-fde6-4340-8c2a-a33e619cb169","Type":"ContainerDied","Data":"0f4e3e4c8bf090c3cdf4d476c991afb5f5ea9a000368b7d24f5bc47e07ffc11a"} Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.755853 4937 scope.go:117] "RemoveContainer" containerID="7f39be550bf2ea50e27e61e024e732b6a9902f5a1cdbeae7babb598137832834" Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.757832 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6zhwd" event={"ID":"b8520984-c620-42eb-ae9d-54a40aa32b55","Type":"ContainerStarted","Data":"3eb54dec09e10e34e5e91f19ded59935da04de0f0cd83099dcde5110305af2a1"} Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.789765 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hmbgl"] Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.794845 4937 scope.go:117] "RemoveContainer" containerID="0f13b05073e93fe4f8c0b053a6797be189e7156234226e4c16195b927c05c16a" Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.818205 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hmbgl"] Feb 25 16:16:15 crc kubenswrapper[4937]: I0225 16:16:15.842348 4937 scope.go:117] "RemoveContainer" containerID="933cdd88bf964e57f8f7a1c4e561f04f1c78964d441dbad7eee28a91cf02ba3c" Feb 25 16:16:16 crc kubenswrapper[4937]: I0225 16:16:16.034341 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 16:16:16 crc kubenswrapper[4937]: I0225 16:16:16.775011 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6zhwd" event={"ID":"b8520984-c620-42eb-ae9d-54a40aa32b55","Type":"ContainerStarted","Data":"c004804f9cfdef65508b11f0a69dcc00575b8841b01bbc64b07143b2dfbb8025"} Feb 25 16:16:16 crc kubenswrapper[4937]: I0225 16:16:16.830766 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-6zhwd" podStartSLOduration=2.276091171 podStartE2EDuration="2.830742976s" podCreationTimestamp="2026-02-25 16:16:14 +0000 UTC" firstStartedPulling="2026-02-25 16:16:15.386807178 +0000 UTC m=+1826.400199068" lastFinishedPulling="2026-02-25 16:16:15.941458993 +0000 UTC m=+1826.954850873" observedRunningTime="2026-02-25 16:16:16.799669967 +0000 UTC m=+1827.813061867" watchObservedRunningTime="2026-02-25 16:16:16.830742976 +0000 UTC m=+1827.844134866" Feb 25 16:16:16 
crc kubenswrapper[4937]: I0225 16:16:16.841337 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:16:16 crc kubenswrapper[4937]: I0225 16:16:16.841749 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="ceilometer-central-agent" containerID="cri-o://3fd5abba776fd0bb13bae5f9ebc8fa3d982275f3d6e89ad2da44ef74dd961a16" gracePeriod=30 Feb 25 16:16:16 crc kubenswrapper[4937]: I0225 16:16:16.841916 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="proxy-httpd" containerID="cri-o://9ad5a3fa9418ea7024edce8e14f568eb427b36aeb0f2ad9bc6c0b2e921a1646d" gracePeriod=30 Feb 25 16:16:16 crc kubenswrapper[4937]: I0225 16:16:16.841983 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="sg-core" containerID="cri-o://6c74f45829c8dd89e12ee5508f75537b3a7afd7ce0829aa3b069a667403af318" gracePeriod=30 Feb 25 16:16:16 crc kubenswrapper[4937]: I0225 16:16:16.842032 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="ceilometer-notification-agent" containerID="cri-o://a41cbb9ec10ee5a3bf122dd49a0838b035020c14d46d84017bedcabe4b3dbd22" gracePeriod=30 Feb 25 16:16:17 crc kubenswrapper[4937]: I0225 16:16:17.006298 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 16:16:17 crc kubenswrapper[4937]: I0225 16:16:17.382662 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9f770fc-fde6-4340-8c2a-a33e619cb169" path="/var/lib/kubelet/pods/e9f770fc-fde6-4340-8c2a-a33e619cb169/volumes" Feb 25 16:16:17 crc kubenswrapper[4937]: I0225 16:16:17.785397 4937 generic.go:334] "Generic (PLEG): container finished" podID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerID="9ad5a3fa9418ea7024edce8e14f568eb427b36aeb0f2ad9bc6c0b2e921a1646d" exitCode=0 Feb 25 16:16:17 crc kubenswrapper[4937]: I0225 16:16:17.786096 4937 generic.go:334] "Generic (PLEG): container finished" podID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerID="6c74f45829c8dd89e12ee5508f75537b3a7afd7ce0829aa3b069a667403af318" exitCode=2 Feb 25 16:16:17 crc kubenswrapper[4937]: I0225 16:16:17.785473 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c5876dc-2444-4491-8ab8-3360d3f4a84c","Type":"ContainerDied","Data":"9ad5a3fa9418ea7024edce8e14f568eb427b36aeb0f2ad9bc6c0b2e921a1646d"} Feb 25 16:16:17 crc kubenswrapper[4937]: I0225 16:16:17.786215 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c5876dc-2444-4491-8ab8-3360d3f4a84c","Type":"ContainerDied","Data":"6c74f45829c8dd89e12ee5508f75537b3a7afd7ce0829aa3b069a667403af318"} Feb 25 16:16:18 crc kubenswrapper[4937]: I0225 16:16:18.798744 4937 generic.go:334] "Generic (PLEG): container finished" podID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerID="3fd5abba776fd0bb13bae5f9ebc8fa3d982275f3d6e89ad2da44ef74dd961a16" exitCode=0 Feb 25 16:16:18 crc kubenswrapper[4937]: I0225 16:16:18.798833 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1c5876dc-2444-4491-8ab8-3360d3f4a84c","Type":"ContainerDied","Data":"3fd5abba776fd0bb13bae5f9ebc8fa3d982275f3d6e89ad2da44ef74dd961a16"} Feb 25 16:16:21 crc kubenswrapper[4937]: I0225 16:16:21.262176 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="b9ebad40-444e-4250-85cb-2a154282cdf9" containerName="rabbitmq" containerID="cri-o://3501318ab23e809f44e4fe03fbb027a573e57d253846d0261206afe1f5c473ba" gracePeriod=604795 Feb 25 16:16:21 crc kubenswrapper[4937]: I0225 16:16:21.843570 4937 generic.go:334] "Generic (PLEG): container finished" podID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerID="a41cbb9ec10ee5a3bf122dd49a0838b035020c14d46d84017bedcabe4b3dbd22" exitCode=0 Feb 25 16:16:21 crc kubenswrapper[4937]: I0225 16:16:21.843643 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c5876dc-2444-4491-8ab8-3360d3f4a84c","Type":"ContainerDied","Data":"a41cbb9ec10ee5a3bf122dd49a0838b035020c14d46d84017bedcabe4b3dbd22"} Feb 25 16:16:21 crc kubenswrapper[4937]: I0225 16:16:21.846313 4937 generic.go:334] "Generic (PLEG): container finished" podID="b8520984-c620-42eb-ae9d-54a40aa32b55" containerID="c004804f9cfdef65508b11f0a69dcc00575b8841b01bbc64b07143b2dfbb8025" exitCode=0 Feb 25 16:16:21 crc kubenswrapper[4937]: I0225 16:16:21.846346 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6zhwd" event={"ID":"b8520984-c620-42eb-ae9d-54a40aa32b55","Type":"ContainerDied","Data":"c004804f9cfdef65508b11f0a69dcc00575b8841b01bbc64b07143b2dfbb8025"} Feb 25 16:16:21 crc kubenswrapper[4937]: I0225 16:16:21.886240 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="de5b4144-33d4-4860-9872-8826c78490a7" containerName="rabbitmq" containerID="cri-o://62a80333ddfc88f47488d577f59b7296f624b0810fae1400d4e957b9531f0159" gracePeriod=604796 Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.085691 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.262137 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-sg-core-conf-yaml\") pod \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.263009 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c5876dc-2444-4491-8ab8-3360d3f4a84c-run-httpd\") pod \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.263067 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-combined-ca-bundle\") pod \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.263329 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c5876dc-2444-4491-8ab8-3360d3f4a84c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1c5876dc-2444-4491-8ab8-3360d3f4a84c" (UID: "1c5876dc-2444-4491-8ab8-3360d3f4a84c"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.263370 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c5876dc-2444-4491-8ab8-3360d3f4a84c-log-httpd\") pod \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.263475 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-scripts\") pod \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.263574 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5xnd\" (UniqueName: \"kubernetes.io/projected/1c5876dc-2444-4491-8ab8-3360d3f4a84c-kube-api-access-w5xnd\") pod \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.263627 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-config-data\") pod \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.263652 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-ceilometer-tls-certs\") pod \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\" (UID: \"1c5876dc-2444-4491-8ab8-3360d3f4a84c\") " Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.263769 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c5876dc-2444-4491-8ab8-3360d3f4a84c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1c5876dc-2444-4491-8ab8-3360d3f4a84c" (UID: "1c5876dc-2444-4491-8ab8-3360d3f4a84c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.264156 4937 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c5876dc-2444-4491-8ab8-3360d3f4a84c-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.264167 4937 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c5876dc-2444-4491-8ab8-3360d3f4a84c-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.268397 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c5876dc-2444-4491-8ab8-3360d3f4a84c-kube-api-access-w5xnd" (OuterVolumeSpecName: "kube-api-access-w5xnd") pod "1c5876dc-2444-4491-8ab8-3360d3f4a84c" (UID: "1c5876dc-2444-4491-8ab8-3360d3f4a84c"). InnerVolumeSpecName "kube-api-access-w5xnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.269151 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-scripts" (OuterVolumeSpecName: "scripts") pod "1c5876dc-2444-4491-8ab8-3360d3f4a84c" (UID: "1c5876dc-2444-4491-8ab8-3360d3f4a84c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.299902 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1c5876dc-2444-4491-8ab8-3360d3f4a84c" (UID: "1c5876dc-2444-4491-8ab8-3360d3f4a84c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.334577 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1c5876dc-2444-4491-8ab8-3360d3f4a84c" (UID: "1c5876dc-2444-4491-8ab8-3360d3f4a84c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.351896 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c5876dc-2444-4491-8ab8-3360d3f4a84c" (UID: "1c5876dc-2444-4491-8ab8-3360d3f4a84c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.366098 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.366127 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5xnd\" (UniqueName: \"kubernetes.io/projected/1c5876dc-2444-4491-8ab8-3360d3f4a84c-kube-api-access-w5xnd\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.366139 4937 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.366146 4937 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.366155 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.384018 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-config-data" (OuterVolumeSpecName: "config-data") pod "1c5876dc-2444-4491-8ab8-3360d3f4a84c" (UID: "1c5876dc-2444-4491-8ab8-3360d3f4a84c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.467861 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c5876dc-2444-4491-8ab8-3360d3f4a84c-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.883129 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.883828 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c5876dc-2444-4491-8ab8-3360d3f4a84c","Type":"ContainerDied","Data":"38fb3c544f77c94d0c9f8954f6738caab19bc5e19992e945b72657058f7bf42d"} Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.883906 4937 scope.go:117] "RemoveContainer" containerID="9ad5a3fa9418ea7024edce8e14f568eb427b36aeb0f2ad9bc6c0b2e921a1646d" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.918243 4937 scope.go:117] "RemoveContainer" containerID="6c74f45829c8dd89e12ee5508f75537b3a7afd7ce0829aa3b069a667403af318" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.930402 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.942703 4937 scope.go:117] "RemoveContainer" containerID="a41cbb9ec10ee5a3bf122dd49a0838b035020c14d46d84017bedcabe4b3dbd22" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.947961 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.966905 4937 scope.go:117] "RemoveContainer" containerID="3fd5abba776fd0bb13bae5f9ebc8fa3d982275f3d6e89ad2da44ef74dd961a16" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.977881 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:16:22 crc kubenswrapper[4937]: E0225 16:16:22.978351 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f770fc-fde6-4340-8c2a-a33e619cb169" containerName="extract-content" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.978367 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f770fc-fde6-4340-8c2a-a33e619cb169" containerName="extract-content" Feb 25 16:16:22 crc kubenswrapper[4937]: E0225 16:16:22.978377 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="sg-core" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.978383 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="sg-core" Feb 25 16:16:22 crc kubenswrapper[4937]: E0225 16:16:22.978407 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f770fc-fde6-4340-8c2a-a33e619cb169" containerName="registry-server" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.978414 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f770fc-fde6-4340-8c2a-a33e619cb169" containerName="registry-server" Feb 25 16:16:22 crc kubenswrapper[4937]: E0225 16:16:22.978421 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="proxy-httpd" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.978426 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="proxy-httpd" Feb 25 16:16:22 crc kubenswrapper[4937]: E0225 
16:16:22.978447 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="ceilometer-central-agent" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.978453 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="ceilometer-central-agent" Feb 25 16:16:22 crc kubenswrapper[4937]: E0225 16:16:22.978459 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="ceilometer-notification-agent" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.978466 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="ceilometer-notification-agent" Feb 25 16:16:22 crc kubenswrapper[4937]: E0225 16:16:22.978497 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f770fc-fde6-4340-8c2a-a33e619cb169" containerName="extract-utilities" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.978503 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f770fc-fde6-4340-8c2a-a33e619cb169" containerName="extract-utilities" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.978753 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="ceilometer-notification-agent" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.978782 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="proxy-httpd" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.978801 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="ceilometer-central-agent" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.978817 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" containerName="sg-core" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.978833 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f770fc-fde6-4340-8c2a-a33e619cb169" containerName="registry-server" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.981291 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.983930 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.984269 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.985350 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.989131 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="b9ebad40-444e-4250-85cb-2a154282cdf9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Feb 25 16:16:22 crc kubenswrapper[4937]: I0225 16:16:22.989543 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.080568 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.080642 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmbx7\" (UniqueName: \"kubernetes.io/projected/4d9d51be-46d2-4d06-8f81-f34e8693e52d-kube-api-access-pmbx7\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.080808 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d9d51be-46d2-4d06-8f81-f34e8693e52d-log-httpd\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.080925 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d9d51be-46d2-4d06-8f81-f34e8693e52d-run-httpd\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.080955 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-scripts\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.080972 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-config-data\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.081032 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.081048 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.183530 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-config-data\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.183640 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.183665 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.184050 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.184720 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmbx7\" (UniqueName: \"kubernetes.io/projected/4d9d51be-46d2-4d06-8f81-f34e8693e52d-kube-api-access-pmbx7\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.184869 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d9d51be-46d2-4d06-8f81-f34e8693e52d-log-httpd\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.185044 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d9d51be-46d2-4d06-8f81-f34e8693e52d-run-httpd\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.185114 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-scripts\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.186528 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d9d51be-46d2-4d06-8f81-f34e8693e52d-log-httpd\") pod 
\"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.186773 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d9d51be-46d2-4d06-8f81-f34e8693e52d-run-httpd\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.190634 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.190675 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.192231 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-scripts\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.192548 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.202891 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9d51be-46d2-4d06-8f81-f34e8693e52d-config-data\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.206115 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmbx7\" (UniqueName: \"kubernetes.io/projected/4d9d51be-46d2-4d06-8f81-f34e8693e52d-kube-api-access-pmbx7\") pod \"ceilometer-0\" (UID: \"4d9d51be-46d2-4d06-8f81-f34e8693e52d\") " pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.311325 4937 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="de5b4144-33d4-4860-9872-8826c78490a7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.312100 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.368360 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:16:23 crc kubenswrapper[4937]: E0225 16:16:23.368962 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.385370 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c5876dc-2444-4491-8ab8-3360d3f4a84c" path="/var/lib/kubelet/pods/1c5876dc-2444-4491-8ab8-3360d3f4a84c/volumes" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.465303 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.596090 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b8520984-c620-42eb-ae9d-54a40aa32b55-certs\") pod \"b8520984-c620-42eb-ae9d-54a40aa32b55\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.596367 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-config-data\") pod \"b8520984-c620-42eb-ae9d-54a40aa32b55\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.596443 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xvv9\" (UniqueName: \"kubernetes.io/projected/b8520984-c620-42eb-ae9d-54a40aa32b55-kube-api-access-7xvv9\") pod \"b8520984-c620-42eb-ae9d-54a40aa32b55\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.596519 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-combined-ca-bundle\") pod \"b8520984-c620-42eb-ae9d-54a40aa32b55\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.596542 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-scripts\") pod \"b8520984-c620-42eb-ae9d-54a40aa32b55\" (UID: \"b8520984-c620-42eb-ae9d-54a40aa32b55\") " Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.605694 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8520984-c620-42eb-ae9d-54a40aa32b55-certs" (OuterVolumeSpecName: "certs") pod "b8520984-c620-42eb-ae9d-54a40aa32b55" (UID: "b8520984-c620-42eb-ae9d-54a40aa32b55"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.611651 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-scripts" (OuterVolumeSpecName: "scripts") pod "b8520984-c620-42eb-ae9d-54a40aa32b55" (UID: "b8520984-c620-42eb-ae9d-54a40aa32b55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.625128 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8520984-c620-42eb-ae9d-54a40aa32b55-kube-api-access-7xvv9" (OuterVolumeSpecName: "kube-api-access-7xvv9") pod "b8520984-c620-42eb-ae9d-54a40aa32b55" (UID: "b8520984-c620-42eb-ae9d-54a40aa32b55"). InnerVolumeSpecName "kube-api-access-7xvv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.636071 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-config-data" (OuterVolumeSpecName: "config-data") pod "b8520984-c620-42eb-ae9d-54a40aa32b55" (UID: "b8520984-c620-42eb-ae9d-54a40aa32b55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.654435 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8520984-c620-42eb-ae9d-54a40aa32b55" (UID: "b8520984-c620-42eb-ae9d-54a40aa32b55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.699602 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xvv9\" (UniqueName: \"kubernetes.io/projected/b8520984-c620-42eb-ae9d-54a40aa32b55-kube-api-access-7xvv9\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.699895 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.699907 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.699917 4937 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b8520984-c620-42eb-ae9d-54a40aa32b55-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.699926 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8520984-c620-42eb-ae9d-54a40aa32b55-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.796880 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.903875 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4d9d51be-46d2-4d06-8f81-f34e8693e52d","Type":"ContainerStarted","Data":"a694634688a7a00e4b8e20986527de2e9d9b76c0e806ffe5be32c24e8730dcd6"} Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.912251 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6zhwd" event={"ID":"b8520984-c620-42eb-ae9d-54a40aa32b55","Type":"ContainerDied","Data":"3eb54dec09e10e34e5e91f19ded59935da04de0f0cd83099dcde5110305af2a1"} Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.912309 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eb54dec09e10e34e5e91f19ded59935da04de0f0cd83099dcde5110305af2a1" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.912415 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-6zhwd" Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.937173 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-qb2mq"] Feb 25 16:16:23 crc kubenswrapper[4937]: I0225 16:16:23.948570 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-qb2mq"] Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.043951 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-bfq7k"] Feb 25 16:16:24 crc kubenswrapper[4937]: E0225 16:16:24.044539 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8520984-c620-42eb-ae9d-54a40aa32b55" containerName="cloudkitty-db-sync" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.044565 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8520984-c620-42eb-ae9d-54a40aa32b55" containerName="cloudkitty-db-sync" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.044831 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8520984-c620-42eb-ae9d-54a40aa32b55" containerName="cloudkitty-db-sync" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.045782 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.047953 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.056628 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-bfq7k"] Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.209230 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-config-data\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.209322 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vxwh\" (UniqueName: \"kubernetes.io/projected/8ee73347-7944-41ba-b1ee-26c8c95f3be6-kube-api-access-2vxwh\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.209375 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-scripts\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.209690 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ee73347-7944-41ba-b1ee-26c8c95f3be6-certs\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.209836 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-combined-ca-bundle\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.312555 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-config-data\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.312855 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vxwh\" (UniqueName: \"kubernetes.io/projected/8ee73347-7944-41ba-b1ee-26c8c95f3be6-kube-api-access-2vxwh\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.312903 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-scripts\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " 
pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.312986 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ee73347-7944-41ba-b1ee-26c8c95f3be6-certs\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.313042 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-combined-ca-bundle\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.317258 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-scripts\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.318078 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-config-data\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.319106 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-combined-ca-bundle\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.319950 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ee73347-7944-41ba-b1ee-26c8c95f3be6-certs\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.341134 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vxwh\" (UniqueName: \"kubernetes.io/projected/8ee73347-7944-41ba-b1ee-26c8c95f3be6-kube-api-access-2vxwh\") pod \"cloudkitty-storageinit-bfq7k\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.370535 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.896781 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-bfq7k"] Feb 25 16:16:24 crc kubenswrapper[4937]: I0225 16:16:24.956961 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-bfq7k" event={"ID":"8ee73347-7944-41ba-b1ee-26c8c95f3be6","Type":"ContainerStarted","Data":"00eb286145f42005943bce04a146b5ea538752f27ee659e56a2f301158729ff1"} Feb 25 16:16:25 crc kubenswrapper[4937]: I0225 16:16:25.381572 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d12dac4-3aaf-41e5-aff8-68749f020d89" path="/var/lib/kubelet/pods/4d12dac4-3aaf-41e5-aff8-68749f020d89/volumes" Feb 25 16:16:25 crc kubenswrapper[4937]: I0225 16:16:25.968319 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-bfq7k" event={"ID":"8ee73347-7944-41ba-b1ee-26c8c95f3be6","Type":"ContainerStarted","Data":"10d3a46390a867296e3caaeb51b08e16c958248b99eadff95dd7ece58428663b"} Feb 25 16:16:25 crc kubenswrapper[4937]: I0225 16:16:25.991631 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-bfq7k" podStartSLOduration=1.99161035 podStartE2EDuration="1.99161035s" podCreationTimestamp="2026-02-25 16:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:16:25.983340802 +0000 UTC m=+1836.996732692" watchObservedRunningTime="2026-02-25 16:16:25.99161035 +0000 UTC m=+1837.005002230" Feb 25 16:16:26 crc kubenswrapper[4937]: I0225 16:16:26.977930 4937 generic.go:334] "Generic (PLEG): container finished" podID="8ee73347-7944-41ba-b1ee-26c8c95f3be6" containerID="10d3a46390a867296e3caaeb51b08e16c958248b99eadff95dd7ece58428663b" exitCode=0 Feb 25 16:16:26 crc kubenswrapper[4937]: I0225 16:16:26.978031 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-bfq7k" event={"ID":"8ee73347-7944-41ba-b1ee-26c8c95f3be6","Type":"ContainerDied","Data":"10d3a46390a867296e3caaeb51b08e16c958248b99eadff95dd7ece58428663b"} Feb 25 16:16:27 crc kubenswrapper[4937]: I0225 16:16:27.993426 4937 generic.go:334] "Generic (PLEG): container finished" podID="b9ebad40-444e-4250-85cb-2a154282cdf9" containerID="3501318ab23e809f44e4fe03fbb027a573e57d253846d0261206afe1f5c473ba" exitCode=0 Feb 25 16:16:27 crc kubenswrapper[4937]: I0225 16:16:27.993520 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9ebad40-444e-4250-85cb-2a154282cdf9","Type":"ContainerDied","Data":"3501318ab23e809f44e4fe03fbb027a573e57d253846d0261206afe1f5c473ba"} Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.278862 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.416147 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-server-conf\") pod \"b9ebad40-444e-4250-85cb-2a154282cdf9\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.416208 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-tls\") pod \"b9ebad40-444e-4250-85cb-2a154282cdf9\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.416271 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-plugins\") pod \"b9ebad40-444e-4250-85cb-2a154282cdf9\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.416354 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-confd\") pod \"b9ebad40-444e-4250-85cb-2a154282cdf9\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.416398 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6895z\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-kube-api-access-6895z\") pod \"b9ebad40-444e-4250-85cb-2a154282cdf9\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.416425 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-config-data\") pod \"b9ebad40-444e-4250-85cb-2a154282cdf9\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.416519 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9ebad40-444e-4250-85cb-2a154282cdf9-pod-info\") pod \"b9ebad40-444e-4250-85cb-2a154282cdf9\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.416554 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-erlang-cookie\") pod \"b9ebad40-444e-4250-85cb-2a154282cdf9\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.416623 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9ebad40-444e-4250-85cb-2a154282cdf9-erlang-cookie-secret\") pod \"b9ebad40-444e-4250-85cb-2a154282cdf9\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.421198 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod 
"b9ebad40-444e-4250-85cb-2a154282cdf9" (UID: "b9ebad40-444e-4250-85cb-2a154282cdf9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.421617 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b9ebad40-444e-4250-85cb-2a154282cdf9" (UID: "b9ebad40-444e-4250-85cb-2a154282cdf9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.424779 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\") pod \"b9ebad40-444e-4250-85cb-2a154282cdf9\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.424876 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-plugins-conf\") pod \"b9ebad40-444e-4250-85cb-2a154282cdf9\" (UID: \"b9ebad40-444e-4250-85cb-2a154282cdf9\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.425890 4937 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.425918 4937 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.436159 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ebad40-444e-4250-85cb-2a154282cdf9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b9ebad40-444e-4250-85cb-2a154282cdf9" (UID: "b9ebad40-444e-4250-85cb-2a154282cdf9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.441137 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-kube-api-access-6895z" (OuterVolumeSpecName: "kube-api-access-6895z") pod "b9ebad40-444e-4250-85cb-2a154282cdf9" (UID: "b9ebad40-444e-4250-85cb-2a154282cdf9"). InnerVolumeSpecName "kube-api-access-6895z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.446915 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b9ebad40-444e-4250-85cb-2a154282cdf9" (UID: "b9ebad40-444e-4250-85cb-2a154282cdf9"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.456371 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b9ebad40-444e-4250-85cb-2a154282cdf9-pod-info" (OuterVolumeSpecName: "pod-info") pod "b9ebad40-444e-4250-85cb-2a154282cdf9" (UID: "b9ebad40-444e-4250-85cb-2a154282cdf9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.460080 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b9ebad40-444e-4250-85cb-2a154282cdf9" (UID: "b9ebad40-444e-4250-85cb-2a154282cdf9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.487891 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757" (OuterVolumeSpecName: "persistence") pod "b9ebad40-444e-4250-85cb-2a154282cdf9" (UID: "b9ebad40-444e-4250-85cb-2a154282cdf9"). InnerVolumeSpecName "pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.518648 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-config-data" (OuterVolumeSpecName: "config-data") pod "b9ebad40-444e-4250-85cb-2a154282cdf9" (UID: "b9ebad40-444e-4250-85cb-2a154282cdf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.533395 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6895z\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-kube-api-access-6895z\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.533429 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.533438 4937 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9ebad40-444e-4250-85cb-2a154282cdf9-pod-info\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.533447 4937 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9ebad40-444e-4250-85cb-2a154282cdf9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.533469 4937 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\") on node \"crc\" " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.533479 4937 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 
16:16:28.533505 4937 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.551019 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-server-conf" (OuterVolumeSpecName: "server-conf") pod "b9ebad40-444e-4250-85cb-2a154282cdf9" (UID: "b9ebad40-444e-4250-85cb-2a154282cdf9"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.594450 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.636469 4937 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9ebad40-444e-4250-85cb-2a154282cdf9-server-conf\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.646300 4937 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.647008 4937 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757") on node "crc" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.698311 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b9ebad40-444e-4250-85cb-2a154282cdf9" (UID: "b9ebad40-444e-4250-85cb-2a154282cdf9"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.701888 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.737818 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vxwh\" (UniqueName: \"kubernetes.io/projected/8ee73347-7944-41ba-b1ee-26c8c95f3be6-kube-api-access-2vxwh\") pod \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.737892 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-scripts\") pod \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.737965 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-combined-ca-bundle\") pod \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.738014 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ee73347-7944-41ba-b1ee-26c8c95f3be6-certs\") pod \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.738501 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-config-data\") pod \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\" (UID: \"8ee73347-7944-41ba-b1ee-26c8c95f3be6\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.739236 4937 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9ebad40-444e-4250-85cb-2a154282cdf9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.739262 4937 reconciler_common.go:293] "Volume detached for volume \"pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.796195 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-scripts" (OuterVolumeSpecName: "scripts") pod "8ee73347-7944-41ba-b1ee-26c8c95f3be6" (UID: "8ee73347-7944-41ba-b1ee-26c8c95f3be6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.799524 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee73347-7944-41ba-b1ee-26c8c95f3be6-certs" (OuterVolumeSpecName: "certs") pod "8ee73347-7944-41ba-b1ee-26c8c95f3be6" (UID: "8ee73347-7944-41ba-b1ee-26c8c95f3be6"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.799593 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee73347-7944-41ba-b1ee-26c8c95f3be6-kube-api-access-2vxwh" (OuterVolumeSpecName: "kube-api-access-2vxwh") pod "8ee73347-7944-41ba-b1ee-26c8c95f3be6" (UID: "8ee73347-7944-41ba-b1ee-26c8c95f3be6"). InnerVolumeSpecName "kube-api-access-2vxwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.815708 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-config-data" (OuterVolumeSpecName: "config-data") pod "8ee73347-7944-41ba-b1ee-26c8c95f3be6" (UID: "8ee73347-7944-41ba-b1ee-26c8c95f3be6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.822680 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ee73347-7944-41ba-b1ee-26c8c95f3be6" (UID: "8ee73347-7944-41ba-b1ee-26c8c95f3be6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.840293 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-erlang-cookie\") pod \"de5b4144-33d4-4860-9872-8826c78490a7\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.840367 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de5b4144-33d4-4860-9872-8826c78490a7-erlang-cookie-secret\") pod \"de5b4144-33d4-4860-9872-8826c78490a7\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.840404 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-plugins-conf\") pod \"de5b4144-33d4-4860-9872-8826c78490a7\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.840683 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-tls\") pod \"de5b4144-33d4-4860-9872-8826c78490a7\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.840775 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5fz6\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-kube-api-access-n5fz6\") pod \"de5b4144-33d4-4860-9872-8826c78490a7\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.842018 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47931ca-1102-49a2-a86d-b68c8818831a\") pod \"de5b4144-33d4-4860-9872-8826c78490a7\" (UID: 
\"de5b4144-33d4-4860-9872-8826c78490a7\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.842115 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-config-data\") pod \"de5b4144-33d4-4860-9872-8826c78490a7\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.842144 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-confd\") pod \"de5b4144-33d4-4860-9872-8826c78490a7\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.842219 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de5b4144-33d4-4860-9872-8826c78490a7-pod-info\") pod \"de5b4144-33d4-4860-9872-8826c78490a7\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.842255 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-plugins\") pod \"de5b4144-33d4-4860-9872-8826c78490a7\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.842317 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-server-conf\") pod \"de5b4144-33d4-4860-9872-8826c78490a7\" (UID: \"de5b4144-33d4-4860-9872-8826c78490a7\") " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.842914 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vxwh\" (UniqueName: \"kubernetes.io/projected/8ee73347-7944-41ba-b1ee-26c8c95f3be6-kube-api-access-2vxwh\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.842939 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.842955 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.842965 4937 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ee73347-7944-41ba-b1ee-26c8c95f3be6-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.842977 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ee73347-7944-41ba-b1ee-26c8c95f3be6-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.849756 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "de5b4144-33d4-4860-9872-8826c78490a7" (UID: "de5b4144-33d4-4860-9872-8826c78490a7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.850151 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "de5b4144-33d4-4860-9872-8826c78490a7" (UID: "de5b4144-33d4-4860-9872-8826c78490a7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.850377 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5b4144-33d4-4860-9872-8826c78490a7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "de5b4144-33d4-4860-9872-8826c78490a7" (UID: "de5b4144-33d4-4860-9872-8826c78490a7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.850674 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "de5b4144-33d4-4860-9872-8826c78490a7" (UID: "de5b4144-33d4-4860-9872-8826c78490a7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.874697 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/de5b4144-33d4-4860-9872-8826c78490a7-pod-info" (OuterVolumeSpecName: "pod-info") pod "de5b4144-33d4-4860-9872-8826c78490a7" (UID: "de5b4144-33d4-4860-9872-8826c78490a7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.875227 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "de5b4144-33d4-4860-9872-8826c78490a7" (UID: "de5b4144-33d4-4860-9872-8826c78490a7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.877677 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-kube-api-access-n5fz6" (OuterVolumeSpecName: "kube-api-access-n5fz6") pod "de5b4144-33d4-4860-9872-8826c78490a7" (UID: "de5b4144-33d4-4860-9872-8826c78490a7"). InnerVolumeSpecName "kube-api-access-n5fz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.922359 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47931ca-1102-49a2-a86d-b68c8818831a" (OuterVolumeSpecName: "persistence") pod "de5b4144-33d4-4860-9872-8826c78490a7" (UID: "de5b4144-33d4-4860-9872-8826c78490a7"). InnerVolumeSpecName "pvc-b47931ca-1102-49a2-a86d-b68c8818831a". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.930247 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-config-data" (OuterVolumeSpecName: "config-data") pod "de5b4144-33d4-4860-9872-8826c78490a7" (UID: "de5b4144-33d4-4860-9872-8826c78490a7"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.957272 4937 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.957340 4937 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de5b4144-33d4-4860-9872-8826c78490a7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.957354 4937 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.957364 4937 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.957374 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5fz6\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-kube-api-access-n5fz6\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.957409 4937 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b47931ca-1102-49a2-a86d-b68c8818831a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47931ca-1102-49a2-a86d-b68c8818831a\") on node \"crc\" " Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.957422 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.957436 4937 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de5b4144-33d4-4860-9872-8826c78490a7-pod-info\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.957446 4937 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:28 crc kubenswrapper[4937]: I0225 16:16:28.978161 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-server-conf" (OuterVolumeSpecName: "server-conf") pod "de5b4144-33d4-4860-9872-8826c78490a7" (UID: "de5b4144-33d4-4860-9872-8826c78490a7"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.016155 4937 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.016331 4937 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b47931ca-1102-49a2-a86d-b68c8818831a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47931ca-1102-49a2-a86d-b68c8818831a") on node "crc" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.053960 4937 generic.go:334] "Generic (PLEG): container finished" podID="de5b4144-33d4-4860-9872-8826c78490a7" containerID="62a80333ddfc88f47488d577f59b7296f624b0810fae1400d4e957b9531f0159" exitCode=0 Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.054074 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"de5b4144-33d4-4860-9872-8826c78490a7","Type":"ContainerDied","Data":"62a80333ddfc88f47488d577f59b7296f624b0810fae1400d4e957b9531f0159"} Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.054101 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"de5b4144-33d4-4860-9872-8826c78490a7","Type":"ContainerDied","Data":"bb534c50e734e38b97ebbded5f5e0bc8937f316e5238e2574bd2ccbcee2d5a97"} Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.054117 4937 scope.go:117] "RemoveContainer" containerID="62a80333ddfc88f47488d577f59b7296f624b0810fae1400d4e957b9531f0159" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.054272 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.061843 4937 reconciler_common.go:293] "Volume detached for volume \"pvc-b47931ca-1102-49a2-a86d-b68c8818831a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47931ca-1102-49a2-a86d-b68c8818831a\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.061876 4937 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de5b4144-33d4-4860-9872-8826c78490a7-server-conf\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.071869 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "de5b4144-33d4-4860-9872-8826c78490a7" (UID: "de5b4144-33d4-4860-9872-8826c78490a7"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.099747 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9ebad40-444e-4250-85cb-2a154282cdf9","Type":"ContainerDied","Data":"394ef23ce490ab50b87b6c8cd5665568dc62a47704fc244bd5c73b58d063cc3a"} Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.099773 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.123673 4937 scope.go:117] "RemoveContainer" containerID="82dee4b670df39dc191f5c519f9747dc8a2893b9682ea4e53c75a02199de7f0c" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.144850 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d9d51be-46d2-4d06-8f81-f34e8693e52d","Type":"ContainerStarted","Data":"09611e0f4500b1a1068f3a393cb78798c2749d4ec286636bc565731d0eb6cb85"} Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.144895 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d9d51be-46d2-4d06-8f81-f34e8693e52d","Type":"ContainerStarted","Data":"1e031f9dddf66a0913220f6edbd7a82c10a447ff74607515f4139b26570fad05"} Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.158800 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-bfq7k" event={"ID":"8ee73347-7944-41ba-b1ee-26c8c95f3be6","Type":"ContainerDied","Data":"00eb286145f42005943bce04a146b5ea538752f27ee659e56a2f301158729ff1"} Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.158850 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00eb286145f42005943bce04a146b5ea538752f27ee659e56a2f301158729ff1" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.158951 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-bfq7k" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.163344 4937 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de5b4144-33d4-4860-9872-8826c78490a7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.166074 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.166331 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="1aaa1053-5b44-458d-aa42-a9804344d2e3" containerName="cloudkitty-proc" containerID="cri-o://312d0b6618c2c81c82d5415d1326e0f815b0f6ac2bb96de77fd0afd3090e1701" gracePeriod=30 Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.184982 4937 scope.go:117] "RemoveContainer" containerID="62a80333ddfc88f47488d577f59b7296f624b0810fae1400d4e957b9531f0159" Feb 25 16:16:29 crc kubenswrapper[4937]: E0225 16:16:29.185460 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62a80333ddfc88f47488d577f59b7296f624b0810fae1400d4e957b9531f0159\": container with ID starting with 62a80333ddfc88f47488d577f59b7296f624b0810fae1400d4e957b9531f0159 not found: ID does not exist" containerID="62a80333ddfc88f47488d577f59b7296f624b0810fae1400d4e957b9531f0159" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.185506 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62a80333ddfc88f47488d577f59b7296f624b0810fae1400d4e957b9531f0159"} err="failed to get container status \"62a80333ddfc88f47488d577f59b7296f624b0810fae1400d4e957b9531f0159\": rpc error: code = NotFound desc = could not find container \"62a80333ddfc88f47488d577f59b7296f624b0810fae1400d4e957b9531f0159\": container with ID starting with 62a80333ddfc88f47488d577f59b7296f624b0810fae1400d4e957b9531f0159 not found: ID does 
not exist" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.185532 4937 scope.go:117] "RemoveContainer" containerID="82dee4b670df39dc191f5c519f9747dc8a2893b9682ea4e53c75a02199de7f0c" Feb 25 16:16:29 crc kubenswrapper[4937]: E0225 16:16:29.185742 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82dee4b670df39dc191f5c519f9747dc8a2893b9682ea4e53c75a02199de7f0c\": container with ID starting with 82dee4b670df39dc191f5c519f9747dc8a2893b9682ea4e53c75a02199de7f0c not found: ID does not exist" containerID="82dee4b670df39dc191f5c519f9747dc8a2893b9682ea4e53c75a02199de7f0c" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.185763 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82dee4b670df39dc191f5c519f9747dc8a2893b9682ea4e53c75a02199de7f0c"} err="failed to get container status \"82dee4b670df39dc191f5c519f9747dc8a2893b9682ea4e53c75a02199de7f0c\": rpc error: code = NotFound desc = could not find container \"82dee4b670df39dc191f5c519f9747dc8a2893b9682ea4e53c75a02199de7f0c\": container with ID starting with 82dee4b670df39dc191f5c519f9747dc8a2893b9682ea4e53c75a02199de7f0c not found: ID does not exist" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.185779 4937 scope.go:117] "RemoveContainer" containerID="3501318ab23e809f44e4fe03fbb027a573e57d253846d0261206afe1f5c473ba" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.194868 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.195171 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="2577b339-c9c0-4e63-afa1-c0b2fb7177b4" containerName="cloudkitty-api-log" containerID="cri-o://40453a9b19a803b2d1da79a4068132eda37db5e9c992255359e0e584ea855f8c" gracePeriod=30 Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.195432 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="2577b339-c9c0-4e63-afa1-c0b2fb7177b4" containerName="cloudkitty-api" containerID="cri-o://73834f0d72d100f7acfc9c3e1aaa128281b7a8d3463254ff25e22c106583027d" gracePeriod=30 Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.233631 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.247743 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.255890 4937 scope.go:117] "RemoveContainer" containerID="7d15cf71941dacdd51d4d3f984cb980362aba44baf3e3b14e00f057c6dd681fc" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.262982 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 16:16:29 crc kubenswrapper[4937]: E0225 16:16:29.263376 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee73347-7944-41ba-b1ee-26c8c95f3be6" containerName="cloudkitty-storageinit" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.263391 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee73347-7944-41ba-b1ee-26c8c95f3be6" containerName="cloudkitty-storageinit" Feb 25 16:16:29 crc kubenswrapper[4937]: E0225 16:16:29.263403 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5b4144-33d4-4860-9872-8826c78490a7" containerName="rabbitmq" Feb 25 16:16:29 crc 
kubenswrapper[4937]: I0225 16:16:29.263409 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5b4144-33d4-4860-9872-8826c78490a7" containerName="rabbitmq" Feb 25 16:16:29 crc kubenswrapper[4937]: E0225 16:16:29.263427 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ebad40-444e-4250-85cb-2a154282cdf9" containerName="rabbitmq" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.263433 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ebad40-444e-4250-85cb-2a154282cdf9" containerName="rabbitmq" Feb 25 16:16:29 crc kubenswrapper[4937]: E0225 16:16:29.263452 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ebad40-444e-4250-85cb-2a154282cdf9" containerName="setup-container" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.263458 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ebad40-444e-4250-85cb-2a154282cdf9" containerName="setup-container" Feb 25 16:16:29 crc kubenswrapper[4937]: E0225 16:16:29.263465 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5b4144-33d4-4860-9872-8826c78490a7" containerName="setup-container" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.263634 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5b4144-33d4-4860-9872-8826c78490a7" containerName="setup-container" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.263828 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="de5b4144-33d4-4860-9872-8826c78490a7" containerName="rabbitmq" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.263851 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ebad40-444e-4250-85cb-2a154282cdf9" containerName="rabbitmq" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.263871 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee73347-7944-41ba-b1ee-26c8c95f3be6" containerName="cloudkitty-storageinit" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.264929 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.271215 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8rn57" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.271551 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.271802 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.272051 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.272234 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.272372 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.273137 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.291260 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.368661 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.368722 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.368751 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.368791 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.368810 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.368842 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjb77\" (UniqueName: 
\"kubernetes.io/projected/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-kube-api-access-gjb77\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.368884 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.368914 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.368940 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-config-data\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.368961 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.368992 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.433769 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9ebad40-444e-4250-85cb-2a154282cdf9" path="/var/lib/kubelet/pods/b9ebad40-444e-4250-85cb-2a154282cdf9/volumes" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.458687 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.471975 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.472030 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.472069 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.472086 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.472117 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjb77\" (UniqueName: \"kubernetes.io/projected/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-kube-api-access-gjb77\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.472155 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.472192 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.472217 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-config-data\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.472234 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.472266 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.472363 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.472855 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " 
pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.474026 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.483447 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.484561 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.491033 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-config-data\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.493025 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.498603 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.503599 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.507857 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.517607 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjb77\" (UniqueName: \"kubernetes.io/projected/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-kube-api-access-gjb77\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.521114 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4779f4bd-7580-49e7-b536-ce3b8c77a8d4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.544072 4937 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.545835 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.551328 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.551397 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/da5caa1ff1e373df82c469164e0fcbd65bc5959bf5874e46dcc56bb65d0f7f87/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.557381 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.563135 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.563401 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lw4h5" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.563571 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.563722 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.563846 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.563963 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.564107 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.678764 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab7e006f-0788-42e5-aee9-543e29514c09-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.678817 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b47931ca-1102-49a2-a86d-b68c8818831a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47931ca-1102-49a2-a86d-b68c8818831a\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.678859 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab7e006f-0788-42e5-aee9-543e29514c09-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.678918 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab7e006f-0788-42e5-aee9-543e29514c09-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.678953 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab7e006f-0788-42e5-aee9-543e29514c09-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.679088 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-956mp\" (UniqueName: \"kubernetes.io/projected/ab7e006f-0788-42e5-aee9-543e29514c09-kube-api-access-956mp\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.679125 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab7e006f-0788-42e5-aee9-543e29514c09-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.679143 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab7e006f-0788-42e5-aee9-543e29514c09-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.679170 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab7e006f-0788-42e5-aee9-543e29514c09-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.679187 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab7e006f-0788-42e5-aee9-543e29514c09-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.679209 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab7e006f-0788-42e5-aee9-543e29514c09-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.737162 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-af1eaef1-daae-4ccb-98fd-e18e5ba36757\") pod \"rabbitmq-server-0\" (UID: \"4779f4bd-7580-49e7-b536-ce3b8c77a8d4\") " pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.780748 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab7e006f-0788-42e5-aee9-543e29514c09-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.780867 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-956mp\" (UniqueName: \"kubernetes.io/projected/ab7e006f-0788-42e5-aee9-543e29514c09-kube-api-access-956mp\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.780900 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab7e006f-0788-42e5-aee9-543e29514c09-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.780922 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab7e006f-0788-42e5-aee9-543e29514c09-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.780950 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab7e006f-0788-42e5-aee9-543e29514c09-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.780971 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab7e006f-0788-42e5-aee9-543e29514c09-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.780988 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab7e006f-0788-42e5-aee9-543e29514c09-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.781009 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab7e006f-0788-42e5-aee9-543e29514c09-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.781030 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b47931ca-1102-49a2-a86d-b68c8818831a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47931ca-1102-49a2-a86d-b68c8818831a\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.781065 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab7e006f-0788-42e5-aee9-543e29514c09-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.781120 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab7e006f-0788-42e5-aee9-543e29514c09-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.781611 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab7e006f-0788-42e5-aee9-543e29514c09-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.785800 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab7e006f-0788-42e5-aee9-543e29514c09-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.786056 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab7e006f-0788-42e5-aee9-543e29514c09-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.786354 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab7e006f-0788-42e5-aee9-543e29514c09-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.786854 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab7e006f-0788-42e5-aee9-543e29514c09-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.787336 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab7e006f-0788-42e5-aee9-543e29514c09-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.790113 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab7e006f-0788-42e5-aee9-543e29514c09-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.790451 4937 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab7e006f-0788-42e5-aee9-543e29514c09-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.792904 4937 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.792941 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b47931ca-1102-49a2-a86d-b68c8818831a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47931ca-1102-49a2-a86d-b68c8818831a\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fd27c87e194eec4e867ab6a57fac5224d596b3e0e40be0379bb0a1ae088b7613/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.793101 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab7e006f-0788-42e5-aee9-543e29514c09-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.814592 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-956mp\" (UniqueName: \"kubernetes.io/projected/ab7e006f-0788-42e5-aee9-543e29514c09-kube-api-access-956mp\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.872901 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b47931ca-1102-49a2-a86d-b68c8818831a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47931ca-1102-49a2-a86d-b68c8818831a\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab7e006f-0788-42e5-aee9-543e29514c09\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.896998 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 16:16:29 crc kubenswrapper[4937]: I0225 16:16:29.906078 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:16:30 crc kubenswrapper[4937]: I0225 16:16:30.176218 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d9d51be-46d2-4d06-8f81-f34e8693e52d","Type":"ContainerStarted","Data":"3ce44c8ba840b876b95ef976b99bbe9d9119870d14f7b8dab1f1ab38fd5c87bd"} Feb 25 16:16:30 crc kubenswrapper[4937]: I0225 16:16:30.185730 4937 generic.go:334] "Generic (PLEG): container finished" podID="2577b339-c9c0-4e63-afa1-c0b2fb7177b4" containerID="40453a9b19a803b2d1da79a4068132eda37db5e9c992255359e0e584ea855f8c" exitCode=143 Feb 25 16:16:30 crc kubenswrapper[4937]: I0225 16:16:30.185842 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2577b339-c9c0-4e63-afa1-c0b2fb7177b4","Type":"ContainerDied","Data":"40453a9b19a803b2d1da79a4068132eda37db5e9c992255359e0e584ea855f8c"} Feb 25 16:16:30 crc kubenswrapper[4937]: I0225 16:16:30.187739 4937 generic.go:334] "Generic (PLEG): container finished" podID="1aaa1053-5b44-458d-aa42-a9804344d2e3" containerID="312d0b6618c2c81c82d5415d1326e0f815b0f6ac2bb96de77fd0afd3090e1701" exitCode=0 Feb 25 16:16:30 crc kubenswrapper[4937]: I0225 16:16:30.187782 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"1aaa1053-5b44-458d-aa42-a9804344d2e3","Type":"ContainerDied","Data":"312d0b6618c2c81c82d5415d1326e0f815b0f6ac2bb96de77fd0afd3090e1701"} Feb 25 16:16:30 crc kubenswrapper[4937]: I0225 16:16:30.436207 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 16:16:30 crc kubenswrapper[4937]: I0225 16:16:30.541324 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 16:16:30 crc kubenswrapper[4937]: W0225 16:16:30.551719 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4779f4bd_7580_49e7_b536_ce3b8c77a8d4.slice/crio-63500acd284ab12a6f33ee5e4ef8a988dc347bc42cdc3b093272eb33a6926cef WatchSource:0}: Error finding container 63500acd284ab12a6f33ee5e4ef8a988dc347bc42cdc3b093272eb33a6926cef: Status 404 returned error can't find the container with id 63500acd284ab12a6f33ee5e4ef8a988dc347bc42cdc3b093272eb33a6926cef Feb 25 16:16:30 crc kubenswrapper[4937]: I0225 16:16:30.988405 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.140381 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-scripts\") pod \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.140474 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49t5d\" (UniqueName: \"kubernetes.io/projected/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-kube-api-access-49t5d\") pod \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.140587 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-config-data-custom\") pod \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.140685 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-logs\") pod \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.140765 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-public-tls-certs\") pod \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.140786 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-internal-tls-certs\") pod \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.140830 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-config-data\") pod \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.140984 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-certs\") pod \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.141047 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-combined-ca-bundle\") pod \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\" (UID: \"2577b339-c9c0-4e63-afa1-c0b2fb7177b4\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.142103 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-logs" (OuterVolumeSpecName: "logs") pod "2577b339-c9c0-4e63-afa1-c0b2fb7177b4" (UID: "2577b339-c9c0-4e63-afa1-c0b2fb7177b4"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.164551 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-scripts" (OuterVolumeSpecName: "scripts") pod "2577b339-c9c0-4e63-afa1-c0b2fb7177b4" (UID: "2577b339-c9c0-4e63-afa1-c0b2fb7177b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.164976 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-certs" (OuterVolumeSpecName: "certs") pod "2577b339-c9c0-4e63-afa1-c0b2fb7177b4" (UID: "2577b339-c9c0-4e63-afa1-c0b2fb7177b4"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.178170 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-kube-api-access-49t5d" (OuterVolumeSpecName: "kube-api-access-49t5d") pod "2577b339-c9c0-4e63-afa1-c0b2fb7177b4" (UID: "2577b339-c9c0-4e63-afa1-c0b2fb7177b4"). InnerVolumeSpecName "kube-api-access-49t5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.178833 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2577b339-c9c0-4e63-afa1-c0b2fb7177b4" (UID: "2577b339-c9c0-4e63-afa1-c0b2fb7177b4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.210732 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4779f4bd-7580-49e7-b536-ce3b8c77a8d4","Type":"ContainerStarted","Data":"63500acd284ab12a6f33ee5e4ef8a988dc347bc42cdc3b093272eb33a6926cef"} Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.212161 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab7e006f-0788-42e5-aee9-543e29514c09","Type":"ContainerStarted","Data":"61435d3571d5f56c632d59101152125eccc5e92f732b097893f93db35c1a4b49"} Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.222789 4937 generic.go:334] "Generic (PLEG): container finished" podID="2577b339-c9c0-4e63-afa1-c0b2fb7177b4" containerID="73834f0d72d100f7acfc9c3e1aaa128281b7a8d3463254ff25e22c106583027d" exitCode=0 Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.222858 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2577b339-c9c0-4e63-afa1-c0b2fb7177b4","Type":"ContainerDied","Data":"73834f0d72d100f7acfc9c3e1aaa128281b7a8d3463254ff25e22c106583027d"} Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.222895 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2577b339-c9c0-4e63-afa1-c0b2fb7177b4","Type":"ContainerDied","Data":"20c0dee27177aadaf0e1776572c1df5c65beebb2c40617a80b58ad229cd71592"} Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.222917 4937 scope.go:117] "RemoveContainer" containerID="73834f0d72d100f7acfc9c3e1aaa128281b7a8d3463254ff25e22c106583027d" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.222862 4937 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.243595 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.243632 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49t5d\" (UniqueName: \"kubernetes.io/projected/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-kube-api-access-49t5d\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.243644 4937 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.243656 4937 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-logs\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.243666 4937 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.278736 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2577b339-c9c0-4e63-afa1-c0b2fb7177b4" (UID: "2577b339-c9c0-4e63-afa1-c0b2fb7177b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.328024 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-config-data" (OuterVolumeSpecName: "config-data") pod "2577b339-c9c0-4e63-afa1-c0b2fb7177b4" (UID: "2577b339-c9c0-4e63-afa1-c0b2fb7177b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.341027 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2577b339-c9c0-4e63-afa1-c0b2fb7177b4" (UID: "2577b339-c9c0-4e63-afa1-c0b2fb7177b4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.353459 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.359208 4937 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.359240 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.397874 4937 scope.go:117] "RemoveContainer" containerID="40453a9b19a803b2d1da79a4068132eda37db5e9c992255359e0e584ea855f8c" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.397977 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.400214 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de5b4144-33d4-4860-9872-8826c78490a7" path="/var/lib/kubelet/pods/de5b4144-33d4-4860-9872-8826c78490a7/volumes" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.410053 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-dl8gl"] Feb 25 16:16:31 crc kubenswrapper[4937]: E0225 16:16:31.411011 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2577b339-c9c0-4e63-afa1-c0b2fb7177b4" containerName="cloudkitty-api" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.411035 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2577b339-c9c0-4e63-afa1-c0b2fb7177b4" containerName="cloudkitty-api" Feb 25 16:16:31 crc kubenswrapper[4937]: E0225 16:16:31.411078 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aaa1053-5b44-458d-aa42-a9804344d2e3" containerName="cloudkitty-proc" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.411087 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aaa1053-5b44-458d-aa42-a9804344d2e3" containerName="cloudkitty-proc" Feb 25 16:16:31 crc kubenswrapper[4937]: E0225 16:16:31.411148 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2577b339-c9c0-4e63-afa1-c0b2fb7177b4" containerName="cloudkitty-api-log" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.411158 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2577b339-c9c0-4e63-afa1-c0b2fb7177b4" containerName="cloudkitty-api-log" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.411579 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="2577b339-c9c0-4e63-afa1-c0b2fb7177b4" containerName="cloudkitty-api-log" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.411609 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="2577b339-c9c0-4e63-afa1-c0b2fb7177b4" containerName="cloudkitty-api" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.411655 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aaa1053-5b44-458d-aa42-a9804344d2e3" containerName="cloudkitty-proc" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.413594 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.413700 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2577b339-c9c0-4e63-afa1-c0b2fb7177b4" (UID: "2577b339-c9c0-4e63-afa1-c0b2fb7177b4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.415351 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-dl8gl"] Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.422282 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.435697 4937 scope.go:117] "RemoveContainer" containerID="73834f0d72d100f7acfc9c3e1aaa128281b7a8d3463254ff25e22c106583027d" Feb 25 16:16:31 crc kubenswrapper[4937]: E0225 16:16:31.439618 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73834f0d72d100f7acfc9c3e1aaa128281b7a8d3463254ff25e22c106583027d\": container with ID starting with 73834f0d72d100f7acfc9c3e1aaa128281b7a8d3463254ff25e22c106583027d not found: ID does not exist" containerID="73834f0d72d100f7acfc9c3e1aaa128281b7a8d3463254ff25e22c106583027d" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.439666 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73834f0d72d100f7acfc9c3e1aaa128281b7a8d3463254ff25e22c106583027d"} err="failed to get container status \"73834f0d72d100f7acfc9c3e1aaa128281b7a8d3463254ff25e22c106583027d\": rpc error: code = NotFound desc = could not find container \"73834f0d72d100f7acfc9c3e1aaa128281b7a8d3463254ff25e22c106583027d\": container with ID starting with 73834f0d72d100f7acfc9c3e1aaa128281b7a8d3463254ff25e22c106583027d not found: ID does not exist" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.439694 4937 scope.go:117] "RemoveContainer" containerID="40453a9b19a803b2d1da79a4068132eda37db5e9c992255359e0e584ea855f8c" Feb 25 16:16:31 crc kubenswrapper[4937]: E0225 16:16:31.440069 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40453a9b19a803b2d1da79a4068132eda37db5e9c992255359e0e584ea855f8c\": container with ID starting with 40453a9b19a803b2d1da79a4068132eda37db5e9c992255359e0e584ea855f8c not found: ID does not exist" containerID="40453a9b19a803b2d1da79a4068132eda37db5e9c992255359e0e584ea855f8c" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.440099 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40453a9b19a803b2d1da79a4068132eda37db5e9c992255359e0e584ea855f8c"} err="failed to get container status \"40453a9b19a803b2d1da79a4068132eda37db5e9c992255359e0e584ea855f8c\": rpc error: code = NotFound desc = could not find container \"40453a9b19a803b2d1da79a4068132eda37db5e9c992255359e0e584ea855f8c\": container with ID starting with 40453a9b19a803b2d1da79a4068132eda37db5e9c992255359e0e584ea855f8c not found: ID does not exist" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.464124 4937 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2577b339-c9c0-4e63-afa1-c0b2fb7177b4-public-tls-certs\") on 
node \"crc\" DevicePath \"\"" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.563821 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.564957 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1aaa1053-5b44-458d-aa42-a9804344d2e3-certs\") pod \"1aaa1053-5b44-458d-aa42-a9804344d2e3\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.565038 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-scripts\") pod \"1aaa1053-5b44-458d-aa42-a9804344d2e3\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.565387 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-combined-ca-bundle\") pod \"1aaa1053-5b44-458d-aa42-a9804344d2e3\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.565563 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4nqz\" (UniqueName: \"kubernetes.io/projected/1aaa1053-5b44-458d-aa42-a9804344d2e3-kube-api-access-z4nqz\") pod \"1aaa1053-5b44-458d-aa42-a9804344d2e3\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.565676 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-config-data\") pod \"1aaa1053-5b44-458d-aa42-a9804344d2e3\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.565763 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-config-data-custom\") pod \"1aaa1053-5b44-458d-aa42-a9804344d2e3\" (UID: \"1aaa1053-5b44-458d-aa42-a9804344d2e3\") " Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.566197 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vrdt\" (UniqueName: \"kubernetes.io/projected/93800d0c-adbc-40bd-a802-7ad6027309a5-kube-api-access-8vrdt\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.571479 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-config\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.574922 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aaa1053-5b44-458d-aa42-a9804344d2e3-kube-api-access-z4nqz" (OuterVolumeSpecName: "kube-api-access-z4nqz") pod "1aaa1053-5b44-458d-aa42-a9804344d2e3" (UID: "1aaa1053-5b44-458d-aa42-a9804344d2e3"). InnerVolumeSpecName "kube-api-access-z4nqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.577529 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.577626 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.577799 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.578211 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aaa1053-5b44-458d-aa42-a9804344d2e3-certs" (OuterVolumeSpecName: "certs") pod "1aaa1053-5b44-458d-aa42-a9804344d2e3" (UID: "1aaa1053-5b44-458d-aa42-a9804344d2e3"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.579164 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.579751 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.580011 4937 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1aaa1053-5b44-458d-aa42-a9804344d2e3-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.580028 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4nqz\" (UniqueName: \"kubernetes.io/projected/1aaa1053-5b44-458d-aa42-a9804344d2e3-kube-api-access-z4nqz\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.597736 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1aaa1053-5b44-458d-aa42-a9804344d2e3" (UID: "1aaa1053-5b44-458d-aa42-a9804344d2e3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.599083 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-scripts" (OuterVolumeSpecName: "scripts") pod "1aaa1053-5b44-458d-aa42-a9804344d2e3" (UID: "1aaa1053-5b44-458d-aa42-a9804344d2e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.608813 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.633515 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.644582 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.645334 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.650034 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.650442 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.652910 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.681411 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-scripts\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.681737 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.681843 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.681946 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.682027 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-config-data\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc 
kubenswrapper[4937]: I0225 16:16:31.682093 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.682180 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.682263 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5787788d-bec1-4541-a34d-26ab6b7f4aa5-logs\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.682351 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5787788d-bec1-4541-a34d-26ab6b7f4aa5-certs\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.682498 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vrdt\" (UniqueName: \"kubernetes.io/projected/93800d0c-adbc-40bd-a802-7ad6027309a5-kube-api-access-8vrdt\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.682580 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-config\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.682666 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.682753 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.682818 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.682896 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.682968 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxrlq\" (UniqueName: \"kubernetes.io/projected/5787788d-bec1-4541-a34d-26ab6b7f4aa5-kube-api-access-sxrlq\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.683098 4937 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.683166 4937 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.684275 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-config\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.684382 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.684983 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.685844 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.686184 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.686742 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.780229 4937 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8vrdt\" (UniqueName: \"kubernetes.io/projected/93800d0c-adbc-40bd-a802-7ad6027309a5-kube-api-access-8vrdt\") pod \"dnsmasq-dns-dbb88bf8c-dl8gl\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.785206 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.785453 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxrlq\" (UniqueName: \"kubernetes.io/projected/5787788d-bec1-4541-a34d-26ab6b7f4aa5-kube-api-access-sxrlq\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.785703 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-scripts\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.785830 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.785925 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-config-data\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.786041 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.786189 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5787788d-bec1-4541-a34d-26ab6b7f4aa5-logs\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.786303 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5787788d-bec1-4541-a34d-26ab6b7f4aa5-certs\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.786508 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc 
kubenswrapper[4937]: I0225 16:16:31.789922 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5787788d-bec1-4541-a34d-26ab6b7f4aa5-logs\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.792128 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.792680 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5787788d-bec1-4541-a34d-26ab6b7f4aa5-certs\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.792719 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-scripts\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.792846 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-config-data\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.793361 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.793871 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.803476 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxrlq\" (UniqueName: \"kubernetes.io/projected/5787788d-bec1-4541-a34d-26ab6b7f4aa5-kube-api-access-sxrlq\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.879473 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5787788d-bec1-4541-a34d-26ab6b7f4aa5-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"5787788d-bec1-4541-a34d-26ab6b7f4aa5\") " pod="openstack/cloudkitty-api-0" Feb 25 16:16:31 crc kubenswrapper[4937]: I0225 16:16:31.962968 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.041803 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.189193 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-config-data" (OuterVolumeSpecName: "config-data") pod "1aaa1053-5b44-458d-aa42-a9804344d2e3" (UID: "1aaa1053-5b44-458d-aa42-a9804344d2e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.195362 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.199771 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1aaa1053-5b44-458d-aa42-a9804344d2e3" (UID: "1aaa1053-5b44-458d-aa42-a9804344d2e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.291945 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"1aaa1053-5b44-458d-aa42-a9804344d2e3","Type":"ContainerDied","Data":"8eca9ea59eee3866e4275d6f72f464108f930c75e50e567c6f08c006d00a3e48"} Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.292003 4937 scope.go:117] "RemoveContainer" containerID="312d0b6618c2c81c82d5415d1326e0f815b0f6ac2bb96de77fd0afd3090e1701" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.292133 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.306677 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aaa1053-5b44-458d-aa42-a9804344d2e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.387057 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.396604 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.440712 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.442319 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.448933 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.490868 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.578623 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.628280 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d41b7062-11e8-401a-a063-8467cf1da4f2-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.628385 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41b7062-11e8-401a-a063-8467cf1da4f2-config-data\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.628442 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41b7062-11e8-401a-a063-8467cf1da4f2-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.628476 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d41b7062-11e8-401a-a063-8467cf1da4f2-certs\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.628610 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41b7062-11e8-401a-a063-8467cf1da4f2-scripts\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.628629 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-755tc\" (UniqueName: \"kubernetes.io/projected/d41b7062-11e8-401a-a063-8467cf1da4f2-kube-api-access-755tc\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.730471 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d41b7062-11e8-401a-a063-8467cf1da4f2-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.730606 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41b7062-11e8-401a-a063-8467cf1da4f2-config-data\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " 
pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.730654 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41b7062-11e8-401a-a063-8467cf1da4f2-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.730681 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d41b7062-11e8-401a-a063-8467cf1da4f2-certs\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.730746 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41b7062-11e8-401a-a063-8467cf1da4f2-scripts\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.730763 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-755tc\" (UniqueName: \"kubernetes.io/projected/d41b7062-11e8-401a-a063-8467cf1da4f2-kube-api-access-755tc\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.738445 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41b7062-11e8-401a-a063-8467cf1da4f2-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.738663 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d41b7062-11e8-401a-a063-8467cf1da4f2-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.740018 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41b7062-11e8-401a-a063-8467cf1da4f2-config-data\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.740782 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d41b7062-11e8-401a-a063-8467cf1da4f2-certs\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.741011 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d41b7062-11e8-401a-a063-8467cf1da4f2-scripts\") pod \"cloudkitty-proc-0\" (UID: \"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.756626 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-755tc\" (UniqueName: \"kubernetes.io/projected/d41b7062-11e8-401a-a063-8467cf1da4f2-kube-api-access-755tc\") pod \"cloudkitty-proc-0\" (UID: 
\"d41b7062-11e8-401a-a063-8467cf1da4f2\") " pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.791329 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 25 16:16:32 crc kubenswrapper[4937]: I0225 16:16:32.858625 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-dl8gl"] Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.250301 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 25 16:16:33 crc kubenswrapper[4937]: W0225 16:16:33.258470 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd41b7062_11e8_401a_a063_8467cf1da4f2.slice/crio-bb691f9a341790d4f829ae3e037c3dda06d7dd0dd13aba6115239d1239bd1e0d WatchSource:0}: Error finding container bb691f9a341790d4f829ae3e037c3dda06d7dd0dd13aba6115239d1239bd1e0d: Status 404 returned error can't find the container with id bb691f9a341790d4f829ae3e037c3dda06d7dd0dd13aba6115239d1239bd1e0d Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.301968 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"d41b7062-11e8-401a-a063-8467cf1da4f2","Type":"ContainerStarted","Data":"bb691f9a341790d4f829ae3e037c3dda06d7dd0dd13aba6115239d1239bd1e0d"} Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.303740 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" event={"ID":"93800d0c-adbc-40bd-a802-7ad6027309a5","Type":"ContainerStarted","Data":"1be7d87dd8f4804e387e3fc4cb00a3337f992d19231a7956ad39ad64a9eb769e"} Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.303782 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" event={"ID":"93800d0c-adbc-40bd-a802-7ad6027309a5","Type":"ContainerStarted","Data":"2bdeeb08c521e0da1ee9c0e773c5f06f631c9c0d73f065b4aef09fb7acb254ef"} Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.305616 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4779f4bd-7580-49e7-b536-ce3b8c77a8d4","Type":"ContainerStarted","Data":"0f08435f515bfb3b7b59822e00edd5da97f82b5d2ff86fbf441687886c4378bd"} Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.308346 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d9d51be-46d2-4d06-8f81-f34e8693e52d","Type":"ContainerStarted","Data":"8cbfa952e47ff3b10139a745ce01446016c840fa3aafbf96ceb44d1d4e1fe6c8"} Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.308496 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.310593 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"5787788d-bec1-4541-a34d-26ab6b7f4aa5","Type":"ContainerStarted","Data":"13091bc91fc88cd9cd119aec1fdc2e165fe484865a388dcfe984c899f890eeb0"} Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.310631 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"5787788d-bec1-4541-a34d-26ab6b7f4aa5","Type":"ContainerStarted","Data":"be634c63071024800ec0f387df4eaf3fc71be25a1c757b1635c154e9e3fb39bf"} Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.310657 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-api-0" event={"ID":"5787788d-bec1-4541-a34d-26ab6b7f4aa5","Type":"ContainerStarted","Data":"3f8fe322e7248ab56bf9fe15db11dcb5631c89a6c2df43a5ce70ecbbeeea5b22"} Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.310730 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.313169 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab7e006f-0788-42e5-aee9-543e29514c09","Type":"ContainerStarted","Data":"f84b5405a78b370c5fa92a522510fced3753c79bf05e459eaa67927076d945a6"} Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.378209 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aaa1053-5b44-458d-aa42-a9804344d2e3" path="/var/lib/kubelet/pods/1aaa1053-5b44-458d-aa42-a9804344d2e3/volumes" Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.378965 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2577b339-c9c0-4e63-afa1-c0b2fb7177b4" path="/var/lib/kubelet/pods/2577b339-c9c0-4e63-afa1-c0b2fb7177b4/volumes" Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.385726 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.521100755 podStartE2EDuration="11.385706502s" podCreationTimestamp="2026-02-25 16:16:22 +0000 UTC" firstStartedPulling="2026-02-25 16:16:23.803090176 +0000 UTC m=+1834.816482066" lastFinishedPulling="2026-02-25 16:16:31.667695933 +0000 UTC m=+1842.681087813" observedRunningTime="2026-02-25 16:16:33.373426305 +0000 UTC m=+1844.386818195" watchObservedRunningTime="2026-02-25 16:16:33.385706502 +0000 UTC m=+1844.399098392" Feb 25 16:16:33 crc kubenswrapper[4937]: I0225 16:16:33.406870 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.4068471320000002 podStartE2EDuration="2.406847132s" podCreationTimestamp="2026-02-25 16:16:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:16:33.397407916 +0000 UTC m=+1844.410799826" watchObservedRunningTime="2026-02-25 16:16:33.406847132 +0000 UTC m=+1844.420239022" Feb 25 16:16:34 crc kubenswrapper[4937]: I0225 16:16:34.336500 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"d41b7062-11e8-401a-a063-8467cf1da4f2","Type":"ContainerStarted","Data":"9ecbd8d18f25a488db66cc70b1c00f83d15e58f2e1c2e46820a699a8fc9717da"} Feb 25 16:16:34 crc kubenswrapper[4937]: I0225 16:16:34.340541 4937 generic.go:334] "Generic (PLEG): container finished" podID="93800d0c-adbc-40bd-a802-7ad6027309a5" containerID="1be7d87dd8f4804e387e3fc4cb00a3337f992d19231a7956ad39ad64a9eb769e" exitCode=0 Feb 25 16:16:34 crc kubenswrapper[4937]: I0225 16:16:34.342051 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" event={"ID":"93800d0c-adbc-40bd-a802-7ad6027309a5","Type":"ContainerDied","Data":"1be7d87dd8f4804e387e3fc4cb00a3337f992d19231a7956ad39ad64a9eb769e"} Feb 25 16:16:34 crc kubenswrapper[4937]: I0225 16:16:34.370238 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.120185385 podStartE2EDuration="2.370214863s" podCreationTimestamp="2026-02-25 16:16:32 +0000 UTC" firstStartedPulling="2026-02-25 16:16:33.260441752 +0000 
UTC m=+1844.273833642" lastFinishedPulling="2026-02-25 16:16:33.51047123 +0000 UTC m=+1844.523863120" observedRunningTime="2026-02-25 16:16:34.35892074 +0000 UTC m=+1845.372312620" watchObservedRunningTime="2026-02-25 16:16:34.370214863 +0000 UTC m=+1845.383606763" Feb 25 16:16:35 crc kubenswrapper[4937]: I0225 16:16:35.354092 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" event={"ID":"93800d0c-adbc-40bd-a802-7ad6027309a5","Type":"ContainerStarted","Data":"6f094fcbc458e798ba08ecb740ab6414729b4c01b7940c38fb298e1a376ef283"} Feb 25 16:16:35 crc kubenswrapper[4937]: I0225 16:16:35.354433 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:35 crc kubenswrapper[4937]: I0225 16:16:35.371212 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" podStartSLOduration=4.371197576 podStartE2EDuration="4.371197576s" podCreationTimestamp="2026-02-25 16:16:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:16:35.36816554 +0000 UTC m=+1846.381557430" watchObservedRunningTime="2026-02-25 16:16:35.371197576 +0000 UTC m=+1846.384589466" Feb 25 16:16:37 crc kubenswrapper[4937]: I0225 16:16:37.055477 4937 scope.go:117] "RemoveContainer" containerID="3c92e8885d8aa5b20c72edf65a0d619e10ed84ce6ea98e7f185ac4ccdd27e091" Feb 25 16:16:37 crc kubenswrapper[4937]: I0225 16:16:37.370121 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:16:37 crc kubenswrapper[4937]: E0225 16:16:37.370377 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.044823 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.154161 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-bkv5w"] Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.154774 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" podUID="f5fe867a-9b48-43f5-b336-107761af2328" containerName="dnsmasq-dns" containerID="cri-o://dd771656c0f5225b3d8beb28d9f0cbec718d26324c36001d17cfbe8e4d8740e2" gracePeriod=10 Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.319972 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-hjwwn"] Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.324782 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.334017 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-hjwwn"] Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.444076 4937 generic.go:334] "Generic (PLEG): container finished" podID="f5fe867a-9b48-43f5-b336-107761af2328" containerID="dd771656c0f5225b3d8beb28d9f0cbec718d26324c36001d17cfbe8e4d8740e2" exitCode=0 Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.444117 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" event={"ID":"f5fe867a-9b48-43f5-b336-107761af2328","Type":"ContainerDied","Data":"dd771656c0f5225b3d8beb28d9f0cbec718d26324c36001d17cfbe8e4d8740e2"} Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.467575 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.467643 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xzv2\" (UniqueName: \"kubernetes.io/projected/4ec5a514-9a47-413c-8e18-113b8295e0b7-kube-api-access-9xzv2\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.468018 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.468119 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-config\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.468157 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-dns-svc\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.468222 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.468347 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.569753 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.570112 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xzv2\" (UniqueName: \"kubernetes.io/projected/4ec5a514-9a47-413c-8e18-113b8295e0b7-kube-api-access-9xzv2\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.570409 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.570443 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-config\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.570662 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.571165 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.571242 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-dns-svc\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.571282 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.571340 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: 
\"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.571420 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-config\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.572260 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.572299 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-dns-svc\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.572608 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ec5a514-9a47-413c-8e18-113b8295e0b7-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.595856 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xzv2\" (UniqueName: \"kubernetes.io/projected/4ec5a514-9a47-413c-8e18-113b8295e0b7-kube-api-access-9xzv2\") pod \"dnsmasq-dns-85f64749dc-hjwwn\" (UID: \"4ec5a514-9a47-413c-8e18-113b8295e0b7\") " pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.661028 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.790166 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.979472 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-config\") pod \"f5fe867a-9b48-43f5-b336-107761af2328\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.979953 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-dns-swift-storage-0\") pod \"f5fe867a-9b48-43f5-b336-107761af2328\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.980263 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-ovsdbserver-nb\") pod \"f5fe867a-9b48-43f5-b336-107761af2328\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.980417 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-ovsdbserver-sb\") pod \"f5fe867a-9b48-43f5-b336-107761af2328\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.980873 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbqb4\" (UniqueName: \"kubernetes.io/projected/f5fe867a-9b48-43f5-b336-107761af2328-kube-api-access-fbqb4\") pod \"f5fe867a-9b48-43f5-b336-107761af2328\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.980940 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-dns-svc\") pod \"f5fe867a-9b48-43f5-b336-107761af2328\" (UID: \"f5fe867a-9b48-43f5-b336-107761af2328\") " Feb 25 16:16:42 crc kubenswrapper[4937]: I0225 16:16:42.985731 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5fe867a-9b48-43f5-b336-107761af2328-kube-api-access-fbqb4" (OuterVolumeSpecName: "kube-api-access-fbqb4") pod "f5fe867a-9b48-43f5-b336-107761af2328" (UID: "f5fe867a-9b48-43f5-b336-107761af2328"). InnerVolumeSpecName "kube-api-access-fbqb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.037141 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-config" (OuterVolumeSpecName: "config") pod "f5fe867a-9b48-43f5-b336-107761af2328" (UID: "f5fe867a-9b48-43f5-b336-107761af2328"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.043461 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5fe867a-9b48-43f5-b336-107761af2328" (UID: "f5fe867a-9b48-43f5-b336-107761af2328"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.053761 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f5fe867a-9b48-43f5-b336-107761af2328" (UID: "f5fe867a-9b48-43f5-b336-107761af2328"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.061187 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f5fe867a-9b48-43f5-b336-107761af2328" (UID: "f5fe867a-9b48-43f5-b336-107761af2328"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.071048 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5fe867a-9b48-43f5-b336-107761af2328" (UID: "f5fe867a-9b48-43f5-b336-107761af2328"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.083714 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.083744 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbqb4\" (UniqueName: \"kubernetes.io/projected/f5fe867a-9b48-43f5-b336-107761af2328-kube-api-access-fbqb4\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.083756 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.083765 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.083774 4937 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.083782 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5fe867a-9b48-43f5-b336-107761af2328-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.156787 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-hjwwn"] Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.457138 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" event={"ID":"4ec5a514-9a47-413c-8e18-113b8295e0b7","Type":"ContainerStarted","Data":"1ec69883ec0fb3e4f86f7ab356dc18fd643e1c4c982d7b2475ffef3505914ea1"} Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.457807 4937 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" event={"ID":"4ec5a514-9a47-413c-8e18-113b8295e0b7","Type":"ContainerStarted","Data":"3a6abb5dd9783864a81bd70cf44f73216c7cb1cac1e20c5057690d699567ee8e"} Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.458990 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" event={"ID":"f5fe867a-9b48-43f5-b336-107761af2328","Type":"ContainerDied","Data":"89976c099320a009c27438e24538906088c48c55ca9eb7de0b81f2c1915f3e3e"} Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.459081 4937 scope.go:117] "RemoveContainer" containerID="dd771656c0f5225b3d8beb28d9f0cbec718d26324c36001d17cfbe8e4d8740e2" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.459200 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-bkv5w" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.522042 4937 scope.go:117] "RemoveContainer" containerID="dedd5a7aa711e5d16d70d5f5a0c88c9388e40acb4df1aa76452fcd6fba141d47" Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.531890 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-bkv5w"] Feb 25 16:16:43 crc kubenswrapper[4937]: I0225 16:16:43.542671 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-bkv5w"] Feb 25 16:16:44 crc kubenswrapper[4937]: I0225 16:16:44.476529 4937 generic.go:334] "Generic (PLEG): container finished" podID="4ec5a514-9a47-413c-8e18-113b8295e0b7" containerID="1ec69883ec0fb3e4f86f7ab356dc18fd643e1c4c982d7b2475ffef3505914ea1" exitCode=0 Feb 25 16:16:44 crc kubenswrapper[4937]: I0225 16:16:44.476603 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" event={"ID":"4ec5a514-9a47-413c-8e18-113b8295e0b7","Type":"ContainerDied","Data":"1ec69883ec0fb3e4f86f7ab356dc18fd643e1c4c982d7b2475ffef3505914ea1"} Feb 25 16:16:45 crc kubenswrapper[4937]: I0225 16:16:45.383783 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5fe867a-9b48-43f5-b336-107761af2328" path="/var/lib/kubelet/pods/f5fe867a-9b48-43f5-b336-107761af2328/volumes" Feb 25 16:16:45 crc kubenswrapper[4937]: I0225 16:16:45.490300 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" event={"ID":"4ec5a514-9a47-413c-8e18-113b8295e0b7","Type":"ContainerStarted","Data":"3aab6eebd46a25351016c9f2203dd271ab65532fa7394b07e0f588b4572b17e5"} Feb 25 16:16:45 crc kubenswrapper[4937]: I0225 16:16:45.490418 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:49 crc kubenswrapper[4937]: I0225 16:16:49.369518 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:16:49 crc kubenswrapper[4937]: E0225 16:16:49.370716 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:16:52 crc kubenswrapper[4937]: I0225 16:16:52.662736 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" Feb 25 16:16:52 crc kubenswrapper[4937]: I0225 16:16:52.689121 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f64749dc-hjwwn" podStartSLOduration=10.689078898 podStartE2EDuration="10.689078898s" podCreationTimestamp="2026-02-25 16:16:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:16:45.519125415 +0000 UTC m=+1856.532517345" watchObservedRunningTime="2026-02-25 16:16:52.689078898 +0000 UTC m=+1863.702470838" Feb 25 16:16:52 crc kubenswrapper[4937]: I0225 16:16:52.733362 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-dl8gl"] Feb 25 16:16:52 crc kubenswrapper[4937]: I0225 16:16:52.733719 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" podUID="93800d0c-adbc-40bd-a802-7ad6027309a5" containerName="dnsmasq-dns" containerID="cri-o://6f094fcbc458e798ba08ecb740ab6414729b4c01b7940c38fb298e1a376ef283" gracePeriod=10 Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.320968 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.321436 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.434780 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-dns-svc\") pod \"93800d0c-adbc-40bd-a802-7ad6027309a5\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.435054 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-config\") pod \"93800d0c-adbc-40bd-a802-7ad6027309a5\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.435149 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-dns-swift-storage-0\") pod \"93800d0c-adbc-40bd-a802-7ad6027309a5\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.435264 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-ovsdbserver-sb\") pod \"93800d0c-adbc-40bd-a802-7ad6027309a5\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.435302 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vrdt\" (UniqueName: \"kubernetes.io/projected/93800d0c-adbc-40bd-a802-7ad6027309a5-kube-api-access-8vrdt\") pod \"93800d0c-adbc-40bd-a802-7ad6027309a5\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.435418 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-openstack-edpm-ipam\") pod 
\"93800d0c-adbc-40bd-a802-7ad6027309a5\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.436340 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-ovsdbserver-nb\") pod \"93800d0c-adbc-40bd-a802-7ad6027309a5\" (UID: \"93800d0c-adbc-40bd-a802-7ad6027309a5\") " Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.465160 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93800d0c-adbc-40bd-a802-7ad6027309a5-kube-api-access-8vrdt" (OuterVolumeSpecName: "kube-api-access-8vrdt") pod "93800d0c-adbc-40bd-a802-7ad6027309a5" (UID: "93800d0c-adbc-40bd-a802-7ad6027309a5"). InnerVolumeSpecName "kube-api-access-8vrdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.494955 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "93800d0c-adbc-40bd-a802-7ad6027309a5" (UID: "93800d0c-adbc-40bd-a802-7ad6027309a5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.501721 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "93800d0c-adbc-40bd-a802-7ad6027309a5" (UID: "93800d0c-adbc-40bd-a802-7ad6027309a5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.507090 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "93800d0c-adbc-40bd-a802-7ad6027309a5" (UID: "93800d0c-adbc-40bd-a802-7ad6027309a5"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.526401 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-config" (OuterVolumeSpecName: "config") pod "93800d0c-adbc-40bd-a802-7ad6027309a5" (UID: "93800d0c-adbc-40bd-a802-7ad6027309a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.533823 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93800d0c-adbc-40bd-a802-7ad6027309a5" (UID: "93800d0c-adbc-40bd-a802-7ad6027309a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.534838 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "93800d0c-adbc-40bd-a802-7ad6027309a5" (UID: "93800d0c-adbc-40bd-a802-7ad6027309a5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.540613 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.540661 4937 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.540675 4937 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.540685 4937 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.540696 4937 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.540705 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vrdt\" (UniqueName: \"kubernetes.io/projected/93800d0c-adbc-40bd-a802-7ad6027309a5-kube-api-access-8vrdt\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.540716 4937 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/93800d0c-adbc-40bd-a802-7ad6027309a5-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.588178 4937 generic.go:334] "Generic (PLEG): container finished" podID="93800d0c-adbc-40bd-a802-7ad6027309a5" containerID="6f094fcbc458e798ba08ecb740ab6414729b4c01b7940c38fb298e1a376ef283" exitCode=0 Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.588223 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" event={"ID":"93800d0c-adbc-40bd-a802-7ad6027309a5","Type":"ContainerDied","Data":"6f094fcbc458e798ba08ecb740ab6414729b4c01b7940c38fb298e1a376ef283"} Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.588248 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" event={"ID":"93800d0c-adbc-40bd-a802-7ad6027309a5","Type":"ContainerDied","Data":"2bdeeb08c521e0da1ee9c0e773c5f06f631c9c0d73f065b4aef09fb7acb254ef"} Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.588264 4937 scope.go:117] "RemoveContainer" containerID="6f094fcbc458e798ba08ecb740ab6414729b4c01b7940c38fb298e1a376ef283" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.588534 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-dl8gl" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.620933 4937 scope.go:117] "RemoveContainer" containerID="1be7d87dd8f4804e387e3fc4cb00a3337f992d19231a7956ad39ad64a9eb769e" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.642217 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-dl8gl"] Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.651175 4937 scope.go:117] "RemoveContainer" containerID="6f094fcbc458e798ba08ecb740ab6414729b4c01b7940c38fb298e1a376ef283" Feb 25 16:16:53 crc kubenswrapper[4937]: E0225 16:16:53.651784 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f094fcbc458e798ba08ecb740ab6414729b4c01b7940c38fb298e1a376ef283\": container with ID starting with 6f094fcbc458e798ba08ecb740ab6414729b4c01b7940c38fb298e1a376ef283 not found: ID does not exist" containerID="6f094fcbc458e798ba08ecb740ab6414729b4c01b7940c38fb298e1a376ef283" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.651825 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f094fcbc458e798ba08ecb740ab6414729b4c01b7940c38fb298e1a376ef283"} err="failed to get container status \"6f094fcbc458e798ba08ecb740ab6414729b4c01b7940c38fb298e1a376ef283\": rpc error: code = NotFound desc = could not find container \"6f094fcbc458e798ba08ecb740ab6414729b4c01b7940c38fb298e1a376ef283\": container with ID starting with 6f094fcbc458e798ba08ecb740ab6414729b4c01b7940c38fb298e1a376ef283 not found: ID does not exist" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.651852 4937 scope.go:117] "RemoveContainer" containerID="1be7d87dd8f4804e387e3fc4cb00a3337f992d19231a7956ad39ad64a9eb769e" Feb 25 16:16:53 crc kubenswrapper[4937]: E0225 16:16:53.652122 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1be7d87dd8f4804e387e3fc4cb00a3337f992d19231a7956ad39ad64a9eb769e\": container with ID starting with 1be7d87dd8f4804e387e3fc4cb00a3337f992d19231a7956ad39ad64a9eb769e not found: ID does not exist" containerID="1be7d87dd8f4804e387e3fc4cb00a3337f992d19231a7956ad39ad64a9eb769e" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.652147 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1be7d87dd8f4804e387e3fc4cb00a3337f992d19231a7956ad39ad64a9eb769e"} err="failed to get container status \"1be7d87dd8f4804e387e3fc4cb00a3337f992d19231a7956ad39ad64a9eb769e\": rpc error: code = NotFound desc = could not find container \"1be7d87dd8f4804e387e3fc4cb00a3337f992d19231a7956ad39ad64a9eb769e\": container with ID starting with 1be7d87dd8f4804e387e3fc4cb00a3337f992d19231a7956ad39ad64a9eb769e not found: ID does not exist" Feb 25 16:16:53 crc kubenswrapper[4937]: I0225 16:16:53.658797 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-dl8gl"] Feb 25 16:16:55 crc kubenswrapper[4937]: I0225 16:16:55.385814 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93800d0c-adbc-40bd-a802-7ad6027309a5" path="/var/lib/kubelet/pods/93800d0c-adbc-40bd-a802-7ad6027309a5/volumes" Feb 25 16:17:02 crc kubenswrapper[4937]: I0225 16:17:02.367506 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:17:02 crc kubenswrapper[4937]: E0225 16:17:02.368346 4937 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:17:04 crc kubenswrapper[4937]: I0225 16:17:04.760747 4937 generic.go:334] "Generic (PLEG): container finished" podID="4779f4bd-7580-49e7-b536-ce3b8c77a8d4" containerID="0f08435f515bfb3b7b59822e00edd5da97f82b5d2ff86fbf441687886c4378bd" exitCode=0 Feb 25 16:17:04 crc kubenswrapper[4937]: I0225 16:17:04.761523 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4779f4bd-7580-49e7-b536-ce3b8c77a8d4","Type":"ContainerDied","Data":"0f08435f515bfb3b7b59822e00edd5da97f82b5d2ff86fbf441687886c4378bd"} Feb 25 16:17:04 crc kubenswrapper[4937]: I0225 16:17:04.765408 4937 generic.go:334] "Generic (PLEG): container finished" podID="ab7e006f-0788-42e5-aee9-543e29514c09" containerID="f84b5405a78b370c5fa92a522510fced3753c79bf05e459eaa67927076d945a6" exitCode=0 Feb 25 16:17:04 crc kubenswrapper[4937]: I0225 16:17:04.765452 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab7e006f-0788-42e5-aee9-543e29514c09","Type":"ContainerDied","Data":"f84b5405a78b370c5fa92a522510fced3753c79bf05e459eaa67927076d945a6"} Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.747249 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j"] Feb 25 16:17:05 crc kubenswrapper[4937]: E0225 16:17:05.748343 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93800d0c-adbc-40bd-a802-7ad6027309a5" containerName="init" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.748376 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="93800d0c-adbc-40bd-a802-7ad6027309a5" containerName="init" Feb 25 16:17:05 crc kubenswrapper[4937]: E0225 16:17:05.748412 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93800d0c-adbc-40bd-a802-7ad6027309a5" containerName="dnsmasq-dns" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.748424 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="93800d0c-adbc-40bd-a802-7ad6027309a5" containerName="dnsmasq-dns" Feb 25 16:17:05 crc kubenswrapper[4937]: E0225 16:17:05.748479 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5fe867a-9b48-43f5-b336-107761af2328" containerName="dnsmasq-dns" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.748523 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5fe867a-9b48-43f5-b336-107761af2328" containerName="dnsmasq-dns" Feb 25 16:17:05 crc kubenswrapper[4937]: E0225 16:17:05.748546 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5fe867a-9b48-43f5-b336-107761af2328" containerName="init" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.748558 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5fe867a-9b48-43f5-b336-107761af2328" containerName="init" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.748972 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5fe867a-9b48-43f5-b336-107761af2328" containerName="dnsmasq-dns" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.749019 4937 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="93800d0c-adbc-40bd-a802-7ad6027309a5" containerName="dnsmasq-dns" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.750331 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.754919 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.755360 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.757173 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j"] Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.764072 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.764294 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.793201 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4779f4bd-7580-49e7-b536-ce3b8c77a8d4","Type":"ContainerStarted","Data":"0f769b2560cc34fe0f994dff060c1bbfbf27eb52375fea5ea7202ded084d357b"} Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.794251 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.798743 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab7e006f-0788-42e5-aee9-543e29514c09","Type":"ContainerStarted","Data":"3a00304b0c4a0afd4b4229cd5206a34489e8b87f9374cbdd89b43b2368ab75a7"} Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.799389 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.825401 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.825379172 podStartE2EDuration="36.825379172s" podCreationTimestamp="2026-02-25 16:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:17:05.815925355 +0000 UTC m=+1876.829317245" watchObservedRunningTime="2026-02-25 16:17:05.825379172 +0000 UTC m=+1876.838771062" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.841665 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.84164945 podStartE2EDuration="36.84164945s" podCreationTimestamp="2026-02-25 16:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:17:05.835423744 +0000 UTC m=+1876.848815634" watchObservedRunningTime="2026-02-25 16:17:05.84164945 +0000 UTC m=+1876.855041340" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.882669 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.882963 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.883713 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.883790 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzp78\" (UniqueName: \"kubernetes.io/projected/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-kube-api-access-pzp78\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.985468 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.985526 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzp78\" (UniqueName: \"kubernetes.io/projected/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-kube-api-access-pzp78\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.985599 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.985651 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.991248 4937 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.991458 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:05 crc kubenswrapper[4937]: I0225 16:17:05.997354 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:06 crc kubenswrapper[4937]: I0225 16:17:06.004577 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzp78\" (UniqueName: \"kubernetes.io/projected/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-kube-api-access-pzp78\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:06 crc kubenswrapper[4937]: I0225 16:17:06.070631 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:06 crc kubenswrapper[4937]: I0225 16:17:06.735250 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j"] Feb 25 16:17:06 crc kubenswrapper[4937]: I0225 16:17:06.810013 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" event={"ID":"6ee081e9-3c3e-4bd7-9c7d-a4a917946879","Type":"ContainerStarted","Data":"1da30215fe3965b3a356771040bbfc0c23d743b2cada279a9944af40ec340100"} Feb 25 16:17:08 crc kubenswrapper[4937]: I0225 16:17:08.820811 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Feb 25 16:17:17 crc kubenswrapper[4937]: I0225 16:17:17.367979 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:17:17 crc kubenswrapper[4937]: E0225 16:17:17.368866 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:17:17 crc kubenswrapper[4937]: I0225 16:17:17.971561 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" event={"ID":"6ee081e9-3c3e-4bd7-9c7d-a4a917946879","Type":"ContainerStarted","Data":"76b3fe36b185461e8dcc95f86004aee3d65f31f8e4803d4590042f0dc33f623f"} Feb 25 16:17:18 crc kubenswrapper[4937]: I0225 16:17:18.003344 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" podStartSLOduration=2.907497439 podStartE2EDuration="13.003322172s" podCreationTimestamp="2026-02-25 16:17:05 +0000 UTC" firstStartedPulling="2026-02-25 16:17:06.722682107 +0000 UTC m=+1877.736073997" lastFinishedPulling="2026-02-25 16:17:16.81850684 +0000 UTC m=+1887.831898730" observedRunningTime="2026-02-25 16:17:17.992670785 +0000 UTC m=+1889.006062725" watchObservedRunningTime="2026-02-25 16:17:18.003322172 +0000 UTC m=+1889.016714072" Feb 25 16:17:19 crc kubenswrapper[4937]: I0225 16:17:19.901643 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 25 16:17:19 crc kubenswrapper[4937]: I0225 16:17:19.909630 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 25 16:17:28 crc kubenswrapper[4937]: I0225 16:17:28.096775 4937 generic.go:334] "Generic (PLEG): container finished" podID="6ee081e9-3c3e-4bd7-9c7d-a4a917946879" containerID="76b3fe36b185461e8dcc95f86004aee3d65f31f8e4803d4590042f0dc33f623f" exitCode=0 Feb 25 16:17:28 crc kubenswrapper[4937]: I0225 16:17:28.096852 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" event={"ID":"6ee081e9-3c3e-4bd7-9c7d-a4a917946879","Type":"ContainerDied","Data":"76b3fe36b185461e8dcc95f86004aee3d65f31f8e4803d4590042f0dc33f623f"} Feb 25 16:17:28 crc kubenswrapper[4937]: I0225 16:17:28.368444 4937 scope.go:117] "RemoveContainer" 
containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:17:28 crc kubenswrapper[4937]: E0225 16:17:28.368998 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:17:29 crc kubenswrapper[4937]: I0225 16:17:29.599259 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:29 crc kubenswrapper[4937]: I0225 16:17:29.724580 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-repo-setup-combined-ca-bundle\") pod \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " Feb 25 16:17:29 crc kubenswrapper[4937]: I0225 16:17:29.724727 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzp78\" (UniqueName: \"kubernetes.io/projected/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-kube-api-access-pzp78\") pod \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " Feb 25 16:17:29 crc kubenswrapper[4937]: I0225 16:17:29.724780 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-ssh-key-openstack-edpm-ipam\") pod \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " Feb 25 16:17:29 crc kubenswrapper[4937]: I0225 16:17:29.724919 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-inventory\") pod \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\" (UID: \"6ee081e9-3c3e-4bd7-9c7d-a4a917946879\") " Feb 25 16:17:29 crc kubenswrapper[4937]: I0225 16:17:29.730626 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6ee081e9-3c3e-4bd7-9c7d-a4a917946879" (UID: "6ee081e9-3c3e-4bd7-9c7d-a4a917946879"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:17:29 crc kubenswrapper[4937]: I0225 16:17:29.735020 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-kube-api-access-pzp78" (OuterVolumeSpecName: "kube-api-access-pzp78") pod "6ee081e9-3c3e-4bd7-9c7d-a4a917946879" (UID: "6ee081e9-3c3e-4bd7-9c7d-a4a917946879"). InnerVolumeSpecName "kube-api-access-pzp78". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:17:29 crc kubenswrapper[4937]: I0225 16:17:29.760630 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-inventory" (OuterVolumeSpecName: "inventory") pod "6ee081e9-3c3e-4bd7-9c7d-a4a917946879" (UID: "6ee081e9-3c3e-4bd7-9c7d-a4a917946879"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:17:29 crc kubenswrapper[4937]: I0225 16:17:29.780893 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6ee081e9-3c3e-4bd7-9c7d-a4a917946879" (UID: "6ee081e9-3c3e-4bd7-9c7d-a4a917946879"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:17:29 crc kubenswrapper[4937]: I0225 16:17:29.827977 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzp78\" (UniqueName: \"kubernetes.io/projected/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-kube-api-access-pzp78\") on node \"crc\" DevicePath \"\"" Feb 25 16:17:29 crc kubenswrapper[4937]: I0225 16:17:29.828021 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:17:29 crc kubenswrapper[4937]: I0225 16:17:29.828037 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:17:29 crc kubenswrapper[4937]: I0225 16:17:29.828049 4937 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ee081e9-3c3e-4bd7-9c7d-a4a917946879-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.121613 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" event={"ID":"6ee081e9-3c3e-4bd7-9c7d-a4a917946879","Type":"ContainerDied","Data":"1da30215fe3965b3a356771040bbfc0c23d743b2cada279a9944af40ec340100"} Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.121661 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1da30215fe3965b3a356771040bbfc0c23d743b2cada279a9944af40ec340100" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.121689 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.208384 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5"] Feb 25 16:17:30 crc kubenswrapper[4937]: E0225 16:17:30.208824 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee081e9-3c3e-4bd7-9c7d-a4a917946879" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.208841 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee081e9-3c3e-4bd7-9c7d-a4a917946879" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.209020 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee081e9-3c3e-4bd7-9c7d-a4a917946879" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.210066 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.212507 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.212597 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.212757 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.214411 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.223253 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5"] Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.338755 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cb9799d-3115-4657-a7f3-18fbcb14a073-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-78hr5\" (UID: \"9cb9799d-3115-4657-a7f3-18fbcb14a073\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.338816 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tdc8\" (UniqueName: \"kubernetes.io/projected/9cb9799d-3115-4657-a7f3-18fbcb14a073-kube-api-access-8tdc8\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-78hr5\" (UID: \"9cb9799d-3115-4657-a7f3-18fbcb14a073\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.338950 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cb9799d-3115-4657-a7f3-18fbcb14a073-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-78hr5\" (UID: \"9cb9799d-3115-4657-a7f3-18fbcb14a073\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.440591 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cb9799d-3115-4657-a7f3-18fbcb14a073-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-78hr5\" (UID: \"9cb9799d-3115-4657-a7f3-18fbcb14a073\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.440673 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tdc8\" (UniqueName: \"kubernetes.io/projected/9cb9799d-3115-4657-a7f3-18fbcb14a073-kube-api-access-8tdc8\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-78hr5\" (UID: \"9cb9799d-3115-4657-a7f3-18fbcb14a073\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.440912 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cb9799d-3115-4657-a7f3-18fbcb14a073-inventory\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-78hr5\" (UID: \"9cb9799d-3115-4657-a7f3-18fbcb14a073\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.448207 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cb9799d-3115-4657-a7f3-18fbcb14a073-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-78hr5\" (UID: \"9cb9799d-3115-4657-a7f3-18fbcb14a073\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.455240 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cb9799d-3115-4657-a7f3-18fbcb14a073-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-78hr5\" (UID: \"9cb9799d-3115-4657-a7f3-18fbcb14a073\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.480289 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tdc8\" (UniqueName: \"kubernetes.io/projected/9cb9799d-3115-4657-a7f3-18fbcb14a073-kube-api-access-8tdc8\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-78hr5\" (UID: \"9cb9799d-3115-4657-a7f3-18fbcb14a073\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" Feb 25 16:17:30 crc kubenswrapper[4937]: I0225 16:17:30.525678 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" Feb 25 16:17:31 crc kubenswrapper[4937]: I0225 16:17:31.061458 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 16:17:31 crc kubenswrapper[4937]: I0225 16:17:31.063180 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5"] Feb 25 16:17:31 crc kubenswrapper[4937]: I0225 16:17:31.138930 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" event={"ID":"9cb9799d-3115-4657-a7f3-18fbcb14a073","Type":"ContainerStarted","Data":"e979bca8d7dd33d161dd4354bd605e867a3f0c835bc349cb3de3583b1edaad2b"} Feb 25 16:17:32 crc kubenswrapper[4937]: I0225 16:17:32.148895 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" event={"ID":"9cb9799d-3115-4657-a7f3-18fbcb14a073","Type":"ContainerStarted","Data":"e0ba2b05622d76543d8e1600f56b4f33c0bca84cb6769c91ae89b0c799da4ef3"} Feb 25 16:17:32 crc kubenswrapper[4937]: I0225 16:17:32.173440 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" podStartSLOduration=1.7328136060000001 podStartE2EDuration="2.173418002s" podCreationTimestamp="2026-02-25 16:17:30 +0000 UTC" firstStartedPulling="2026-02-25 16:17:31.060851581 +0000 UTC m=+1902.074243521" lastFinishedPulling="2026-02-25 16:17:31.501456007 +0000 UTC m=+1902.514847917" observedRunningTime="2026-02-25 16:17:32.165424871 +0000 UTC m=+1903.178816761" watchObservedRunningTime="2026-02-25 16:17:32.173418002 +0000 UTC m=+1903.186809892" Feb 25 16:17:34 crc kubenswrapper[4937]: I0225 16:17:34.168461 4937 generic.go:334] "Generic (PLEG): container finished" podID="9cb9799d-3115-4657-a7f3-18fbcb14a073" 
containerID="e0ba2b05622d76543d8e1600f56b4f33c0bca84cb6769c91ae89b0c799da4ef3" exitCode=0 Feb 25 16:17:34 crc kubenswrapper[4937]: I0225 16:17:34.168538 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" event={"ID":"9cb9799d-3115-4657-a7f3-18fbcb14a073","Type":"ContainerDied","Data":"e0ba2b05622d76543d8e1600f56b4f33c0bca84cb6769c91ae89b0c799da4ef3"} Feb 25 16:17:35 crc kubenswrapper[4937]: I0225 16:17:35.720738 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" Feb 25 16:17:35 crc kubenswrapper[4937]: I0225 16:17:35.857380 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cb9799d-3115-4657-a7f3-18fbcb14a073-ssh-key-openstack-edpm-ipam\") pod \"9cb9799d-3115-4657-a7f3-18fbcb14a073\" (UID: \"9cb9799d-3115-4657-a7f3-18fbcb14a073\") " Feb 25 16:17:35 crc kubenswrapper[4937]: I0225 16:17:35.857468 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cb9799d-3115-4657-a7f3-18fbcb14a073-inventory\") pod \"9cb9799d-3115-4657-a7f3-18fbcb14a073\" (UID: \"9cb9799d-3115-4657-a7f3-18fbcb14a073\") " Feb 25 16:17:35 crc kubenswrapper[4937]: I0225 16:17:35.857657 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdc8\" (UniqueName: \"kubernetes.io/projected/9cb9799d-3115-4657-a7f3-18fbcb14a073-kube-api-access-8tdc8\") pod \"9cb9799d-3115-4657-a7f3-18fbcb14a073\" (UID: \"9cb9799d-3115-4657-a7f3-18fbcb14a073\") " Feb 25 16:17:35 crc kubenswrapper[4937]: I0225 16:17:35.862994 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cb9799d-3115-4657-a7f3-18fbcb14a073-kube-api-access-8tdc8" (OuterVolumeSpecName: "kube-api-access-8tdc8") pod "9cb9799d-3115-4657-a7f3-18fbcb14a073" (UID: "9cb9799d-3115-4657-a7f3-18fbcb14a073"). InnerVolumeSpecName "kube-api-access-8tdc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:17:35 crc kubenswrapper[4937]: I0225 16:17:35.888904 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb9799d-3115-4657-a7f3-18fbcb14a073-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9cb9799d-3115-4657-a7f3-18fbcb14a073" (UID: "9cb9799d-3115-4657-a7f3-18fbcb14a073"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:17:35 crc kubenswrapper[4937]: I0225 16:17:35.891008 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb9799d-3115-4657-a7f3-18fbcb14a073-inventory" (OuterVolumeSpecName: "inventory") pod "9cb9799d-3115-4657-a7f3-18fbcb14a073" (UID: "9cb9799d-3115-4657-a7f3-18fbcb14a073"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:17:35 crc kubenswrapper[4937]: I0225 16:17:35.959844 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cb9799d-3115-4657-a7f3-18fbcb14a073-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:17:35 crc kubenswrapper[4937]: I0225 16:17:35.959872 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdc8\" (UniqueName: \"kubernetes.io/projected/9cb9799d-3115-4657-a7f3-18fbcb14a073-kube-api-access-8tdc8\") on node \"crc\" DevicePath \"\"" Feb 25 16:17:35 crc kubenswrapper[4937]: I0225 16:17:35.959884 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cb9799d-3115-4657-a7f3-18fbcb14a073-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.196218 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" event={"ID":"9cb9799d-3115-4657-a7f3-18fbcb14a073","Type":"ContainerDied","Data":"e979bca8d7dd33d161dd4354bd605e867a3f0c835bc349cb3de3583b1edaad2b"} Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.196251 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-78hr5" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.196253 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e979bca8d7dd33d161dd4354bd605e867a3f0c835bc349cb3de3583b1edaad2b" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.257640 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc"] Feb 25 16:17:36 crc kubenswrapper[4937]: E0225 16:17:36.258085 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb9799d-3115-4657-a7f3-18fbcb14a073" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.258106 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb9799d-3115-4657-a7f3-18fbcb14a073" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.258314 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb9799d-3115-4657-a7f3-18fbcb14a073" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.259252 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.261325 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.261641 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.261870 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.262041 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.271126 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc"] Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.367519 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.367650 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdnwr\" (UniqueName: \"kubernetes.io/projected/165310e1-208b-4a29-a8fd-be630d60fc08-kube-api-access-rdnwr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.367715 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.367751 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.470793 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.470856 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.471044 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.471106 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdnwr\" (UniqueName: \"kubernetes.io/projected/165310e1-208b-4a29-a8fd-be630d60fc08-kube-api-access-rdnwr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.477231 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.478839 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.487025 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.490940 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdnwr\" (UniqueName: \"kubernetes.io/projected/165310e1-208b-4a29-a8fd-be630d60fc08-kube-api-access-rdnwr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:36 crc kubenswrapper[4937]: I0225 16:17:36.603958 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:17:37 crc kubenswrapper[4937]: W0225 16:17:37.160334 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod165310e1_208b_4a29_a8fd_be630d60fc08.slice/crio-eaf67f8055e2ff7523d6c8b943dc2ed17d4338ac50b944a29ec0b0bd81649536 WatchSource:0}: Error finding container eaf67f8055e2ff7523d6c8b943dc2ed17d4338ac50b944a29ec0b0bd81649536: Status 404 returned error can't find the container with id eaf67f8055e2ff7523d6c8b943dc2ed17d4338ac50b944a29ec0b0bd81649536 Feb 25 16:17:37 crc kubenswrapper[4937]: I0225 16:17:37.166925 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc"] Feb 25 16:17:37 crc kubenswrapper[4937]: I0225 16:17:37.207003 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" event={"ID":"165310e1-208b-4a29-a8fd-be630d60fc08","Type":"ContainerStarted","Data":"eaf67f8055e2ff7523d6c8b943dc2ed17d4338ac50b944a29ec0b0bd81649536"} Feb 25 16:17:37 crc kubenswrapper[4937]: I0225 16:17:37.250074 4937 scope.go:117] "RemoveContainer" containerID="0245edefc1fec8749d4735f0ca12e19c045029c5a10ad5e50e21bb8a1e76d09d" Feb 25 16:17:37 crc kubenswrapper[4937]: I0225 16:17:37.289851 4937 scope.go:117] "RemoveContainer" containerID="d7ccbc37d2e9e7845e22afd673f137c9bda281f4362f748a265d8c6f692ae8c8" Feb 25 16:17:37 crc kubenswrapper[4937]: I0225 16:17:37.342083 4937 scope.go:117] "RemoveContainer" containerID="29e6688b03fc9b75617fe3e732dcc1252c3dcb1e14b062bb630fe55ccd5b9d82" Feb 25 16:17:37 crc kubenswrapper[4937]: I0225 16:17:37.395372 4937 scope.go:117] "RemoveContainer" containerID="4757fbd259c99a2f9f1412b84438fd9fade0f8f5a039a702d6f0c5dac11e2cc9" Feb 25 16:17:38 crc kubenswrapper[4937]: I0225 16:17:38.220291 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" event={"ID":"165310e1-208b-4a29-a8fd-be630d60fc08","Type":"ContainerStarted","Data":"2e4023dd73822cf45e9756b0b9ee873c654101eae60c672e5da5b01097ea3f19"} Feb 25 16:17:38 crc kubenswrapper[4937]: I0225 16:17:38.244770 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" podStartSLOduration=1.823595686 podStartE2EDuration="2.244731534s" podCreationTimestamp="2026-02-25 16:17:36 +0000 UTC" firstStartedPulling="2026-02-25 16:17:37.162416711 +0000 UTC m=+1908.175808601" lastFinishedPulling="2026-02-25 16:17:37.583552559 +0000 UTC m=+1908.596944449" observedRunningTime="2026-02-25 16:17:38.237451501 +0000 UTC m=+1909.250843391" watchObservedRunningTime="2026-02-25 16:17:38.244731534 +0000 UTC m=+1909.258123464" Feb 25 16:17:41 crc kubenswrapper[4937]: I0225 16:17:41.376858 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:17:41 crc kubenswrapper[4937]: E0225 16:17:41.377673 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:17:52 crc 
kubenswrapper[4937]: I0225 16:17:52.367939 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:17:52 crc kubenswrapper[4937]: E0225 16:17:52.370306 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:18:00 crc kubenswrapper[4937]: I0225 16:18:00.152880 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533938-bhrf7"] Feb 25 16:18:00 crc kubenswrapper[4937]: I0225 16:18:00.155128 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533938-bhrf7" Feb 25 16:18:00 crc kubenswrapper[4937]: I0225 16:18:00.157058 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:18:00 crc kubenswrapper[4937]: I0225 16:18:00.158235 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:18:00 crc kubenswrapper[4937]: I0225 16:18:00.159052 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:18:00 crc kubenswrapper[4937]: I0225 16:18:00.162099 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533938-bhrf7"] Feb 25 16:18:00 crc kubenswrapper[4937]: I0225 16:18:00.190808 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnc24\" (UniqueName: \"kubernetes.io/projected/947fb65f-cfe3-411c-9ebd-4f89480703e0-kube-api-access-cnc24\") pod \"auto-csr-approver-29533938-bhrf7\" (UID: \"947fb65f-cfe3-411c-9ebd-4f89480703e0\") " pod="openshift-infra/auto-csr-approver-29533938-bhrf7" Feb 25 16:18:00 crc kubenswrapper[4937]: I0225 16:18:00.292382 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnc24\" (UniqueName: \"kubernetes.io/projected/947fb65f-cfe3-411c-9ebd-4f89480703e0-kube-api-access-cnc24\") pod \"auto-csr-approver-29533938-bhrf7\" (UID: \"947fb65f-cfe3-411c-9ebd-4f89480703e0\") " pod="openshift-infra/auto-csr-approver-29533938-bhrf7" Feb 25 16:18:00 crc kubenswrapper[4937]: I0225 16:18:00.314819 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnc24\" (UniqueName: \"kubernetes.io/projected/947fb65f-cfe3-411c-9ebd-4f89480703e0-kube-api-access-cnc24\") pod \"auto-csr-approver-29533938-bhrf7\" (UID: \"947fb65f-cfe3-411c-9ebd-4f89480703e0\") " pod="openshift-infra/auto-csr-approver-29533938-bhrf7" Feb 25 16:18:00 crc kubenswrapper[4937]: I0225 16:18:00.478627 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533938-bhrf7" Feb 25 16:18:00 crc kubenswrapper[4937]: I0225 16:18:00.969880 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533938-bhrf7"] Feb 25 16:18:01 crc kubenswrapper[4937]: I0225 16:18:01.535992 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533938-bhrf7" event={"ID":"947fb65f-cfe3-411c-9ebd-4f89480703e0","Type":"ContainerStarted","Data":"28b20b5d303eee9f70130c45ab6158fe7972079e8eb4288bb535d3333cc53d57"} Feb 25 16:18:03 crc kubenswrapper[4937]: I0225 16:18:03.585514 4937 generic.go:334] "Generic (PLEG): container finished" podID="947fb65f-cfe3-411c-9ebd-4f89480703e0" containerID="4e633c736d9b0cd964a0d626c3ae05a3f211a5d2b78c41a8d01cc2d9ca52193b" exitCode=0 Feb 25 16:18:03 crc kubenswrapper[4937]: I0225 16:18:03.585575 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533938-bhrf7" event={"ID":"947fb65f-cfe3-411c-9ebd-4f89480703e0","Type":"ContainerDied","Data":"4e633c736d9b0cd964a0d626c3ae05a3f211a5d2b78c41a8d01cc2d9ca52193b"} Feb 25 16:18:05 crc kubenswrapper[4937]: I0225 16:18:05.012258 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533938-bhrf7" Feb 25 16:18:05 crc kubenswrapper[4937]: I0225 16:18:05.116258 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnc24\" (UniqueName: \"kubernetes.io/projected/947fb65f-cfe3-411c-9ebd-4f89480703e0-kube-api-access-cnc24\") pod \"947fb65f-cfe3-411c-9ebd-4f89480703e0\" (UID: \"947fb65f-cfe3-411c-9ebd-4f89480703e0\") " Feb 25 16:18:05 crc kubenswrapper[4937]: I0225 16:18:05.121684 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/947fb65f-cfe3-411c-9ebd-4f89480703e0-kube-api-access-cnc24" (OuterVolumeSpecName: "kube-api-access-cnc24") pod "947fb65f-cfe3-411c-9ebd-4f89480703e0" (UID: "947fb65f-cfe3-411c-9ebd-4f89480703e0"). InnerVolumeSpecName "kube-api-access-cnc24". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:18:05 crc kubenswrapper[4937]: I0225 16:18:05.219736 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnc24\" (UniqueName: \"kubernetes.io/projected/947fb65f-cfe3-411c-9ebd-4f89480703e0-kube-api-access-cnc24\") on node \"crc\" DevicePath \"\"" Feb 25 16:18:05 crc kubenswrapper[4937]: I0225 16:18:05.630201 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533938-bhrf7" event={"ID":"947fb65f-cfe3-411c-9ebd-4f89480703e0","Type":"ContainerDied","Data":"28b20b5d303eee9f70130c45ab6158fe7972079e8eb4288bb535d3333cc53d57"} Feb 25 16:18:05 crc kubenswrapper[4937]: I0225 16:18:05.630240 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28b20b5d303eee9f70130c45ab6158fe7972079e8eb4288bb535d3333cc53d57" Feb 25 16:18:05 crc kubenswrapper[4937]: I0225 16:18:05.630249 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533938-bhrf7" Feb 25 16:18:06 crc kubenswrapper[4937]: I0225 16:18:06.103673 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533932-w2rgh"] Feb 25 16:18:06 crc kubenswrapper[4937]: I0225 16:18:06.114716 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533932-w2rgh"] Feb 25 16:18:07 crc kubenswrapper[4937]: I0225 16:18:07.368251 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:18:07 crc kubenswrapper[4937]: E0225 16:18:07.369104 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:18:07 crc kubenswrapper[4937]: I0225 16:18:07.381671 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa840e84-cc33-4fed-9f62-50fc286001be" path="/var/lib/kubelet/pods/fa840e84-cc33-4fed-9f62-50fc286001be/volumes" Feb 25 16:18:22 crc kubenswrapper[4937]: I0225 16:18:22.368435 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:18:22 crc kubenswrapper[4937]: E0225 16:18:22.369347 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:18:37 crc kubenswrapper[4937]: I0225 16:18:37.368288 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:18:37 crc kubenswrapper[4937]: E0225 16:18:37.369079 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:18:37 crc kubenswrapper[4937]: I0225 16:18:37.567236 4937 scope.go:117] "RemoveContainer" containerID="81db60abb1ddaad73fa876838afd22b15acb3e798953036b822338fc96b2d90e" Feb 25 16:18:37 crc kubenswrapper[4937]: I0225 16:18:37.589575 4937 scope.go:117] "RemoveContainer" containerID="ae00e1d43fdb53cf7f2889ee0de867d00e6d545940c114f95c7fc3b3bcbcd53e" Feb 25 16:18:37 crc kubenswrapper[4937]: I0225 16:18:37.619040 4937 scope.go:117] "RemoveContainer" containerID="b5892b819d2cbe4c4048f7f1bc8d670c5a45525ba43ae5a70eccd31d9c0df8aa" Feb 25 16:18:37 crc kubenswrapper[4937]: I0225 16:18:37.637502 4937 scope.go:117] "RemoveContainer" containerID="9c4c47f7ad325d62685685ac806ca0d0f52131c5f2fb11a44098758d49c54951" Feb 25 16:18:37 crc kubenswrapper[4937]: I0225 16:18:37.661025 4937 scope.go:117] "RemoveContainer" 
containerID="62353a78138d2688a31e712416ee99ca54669f43d67873016bea95f2f68021f6" Feb 25 16:18:37 crc kubenswrapper[4937]: I0225 16:18:37.699989 4937 scope.go:117] "RemoveContainer" containerID="3e1c5775ace7d313d3863739ee7d2d84dd18e635104cd379c0af96fca2dab63e" Feb 25 16:18:37 crc kubenswrapper[4937]: I0225 16:18:37.740786 4937 scope.go:117] "RemoveContainer" containerID="b6da8204356ff39a00ea99d54dc40b9b8636f0cc59c85f0911e2068123d6ba93" Feb 25 16:18:37 crc kubenswrapper[4937]: I0225 16:18:37.793073 4937 scope.go:117] "RemoveContainer" containerID="76045b66ae396f1673da50d35a9cdea3cc02e17bb8a1daaf6a0d60d35c0e65e4" Feb 25 16:18:51 crc kubenswrapper[4937]: I0225 16:18:51.376717 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:18:52 crc kubenswrapper[4937]: I0225 16:18:52.155375 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"a1902f50a856ad1bac31a5abd173e354d882341ff395f33b062b6ec2ed08e38e"} Feb 25 16:19:37 crc kubenswrapper[4937]: I0225 16:19:37.940564 4937 scope.go:117] "RemoveContainer" containerID="83fa1dba06b54c83091f7f6bfeac3df28e8fb5313d3de3b14bfb9ee33cfda3d4" Feb 25 16:19:37 crc kubenswrapper[4937]: I0225 16:19:37.993027 4937 scope.go:117] "RemoveContainer" containerID="c425910de0a5e69eaff03516c0ca9937a3f17e41f9618d78a391bfc1c88e8e11" Feb 25 16:20:00 crc kubenswrapper[4937]: I0225 16:20:00.163429 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533940-84f7g"] Feb 25 16:20:00 crc kubenswrapper[4937]: E0225 16:20:00.164817 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947fb65f-cfe3-411c-9ebd-4f89480703e0" containerName="oc" Feb 25 16:20:00 crc kubenswrapper[4937]: I0225 16:20:00.164839 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="947fb65f-cfe3-411c-9ebd-4f89480703e0" containerName="oc" Feb 25 16:20:00 crc kubenswrapper[4937]: I0225 16:20:00.165228 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="947fb65f-cfe3-411c-9ebd-4f89480703e0" containerName="oc" Feb 25 16:20:00 crc kubenswrapper[4937]: I0225 16:20:00.166560 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533940-84f7g" Feb 25 16:20:00 crc kubenswrapper[4937]: I0225 16:20:00.170580 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:20:00 crc kubenswrapper[4937]: I0225 16:20:00.174200 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:20:00 crc kubenswrapper[4937]: I0225 16:20:00.174211 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:20:00 crc kubenswrapper[4937]: I0225 16:20:00.179090 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533940-84f7g"] Feb 25 16:20:00 crc kubenswrapper[4937]: I0225 16:20:00.268918 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb7j5\" (UniqueName: \"kubernetes.io/projected/e7c4e56b-a36f-48a7-b4d7-b962a074d740-kube-api-access-fb7j5\") pod \"auto-csr-approver-29533940-84f7g\" (UID: \"e7c4e56b-a36f-48a7-b4d7-b962a074d740\") " pod="openshift-infra/auto-csr-approver-29533940-84f7g" Feb 25 16:20:00 crc kubenswrapper[4937]: I0225 16:20:00.371083 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb7j5\" (UniqueName: \"kubernetes.io/projected/e7c4e56b-a36f-48a7-b4d7-b962a074d740-kube-api-access-fb7j5\") pod \"auto-csr-approver-29533940-84f7g\" (UID: \"e7c4e56b-a36f-48a7-b4d7-b962a074d740\") " pod="openshift-infra/auto-csr-approver-29533940-84f7g" Feb 25 16:20:00 crc kubenswrapper[4937]: I0225 16:20:00.395435 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb7j5\" (UniqueName: \"kubernetes.io/projected/e7c4e56b-a36f-48a7-b4d7-b962a074d740-kube-api-access-fb7j5\") pod \"auto-csr-approver-29533940-84f7g\" (UID: \"e7c4e56b-a36f-48a7-b4d7-b962a074d740\") " pod="openshift-infra/auto-csr-approver-29533940-84f7g" Feb 25 16:20:00 crc kubenswrapper[4937]: I0225 16:20:00.516414 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533940-84f7g" Feb 25 16:20:00 crc kubenswrapper[4937]: I0225 16:20:00.995383 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533940-84f7g"] Feb 25 16:20:01 crc kubenswrapper[4937]: I0225 16:20:01.987143 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533940-84f7g" event={"ID":"e7c4e56b-a36f-48a7-b4d7-b962a074d740","Type":"ContainerStarted","Data":"5ad95c6469de20bbc4d28d15fa2fc1aa56f4d960b62b76763db42d9880f9f21f"} Feb 25 16:20:02 crc kubenswrapper[4937]: I0225 16:20:02.998275 4937 generic.go:334] "Generic (PLEG): container finished" podID="e7c4e56b-a36f-48a7-b4d7-b962a074d740" containerID="a4a7497aa724f8c0c1e182934a3c79d21817bf8baff829875b9f1ddf561476e7" exitCode=0 Feb 25 16:20:02 crc kubenswrapper[4937]: I0225 16:20:02.998333 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533940-84f7g" event={"ID":"e7c4e56b-a36f-48a7-b4d7-b962a074d740","Type":"ContainerDied","Data":"a4a7497aa724f8c0c1e182934a3c79d21817bf8baff829875b9f1ddf561476e7"} Feb 25 16:20:04 crc kubenswrapper[4937]: I0225 16:20:04.445659 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533940-84f7g" Feb 25 16:20:04 crc kubenswrapper[4937]: I0225 16:20:04.559217 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb7j5\" (UniqueName: \"kubernetes.io/projected/e7c4e56b-a36f-48a7-b4d7-b962a074d740-kube-api-access-fb7j5\") pod \"e7c4e56b-a36f-48a7-b4d7-b962a074d740\" (UID: \"e7c4e56b-a36f-48a7-b4d7-b962a074d740\") " Feb 25 16:20:04 crc kubenswrapper[4937]: I0225 16:20:04.568715 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c4e56b-a36f-48a7-b4d7-b962a074d740-kube-api-access-fb7j5" (OuterVolumeSpecName: "kube-api-access-fb7j5") pod "e7c4e56b-a36f-48a7-b4d7-b962a074d740" (UID: "e7c4e56b-a36f-48a7-b4d7-b962a074d740"). InnerVolumeSpecName "kube-api-access-fb7j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:20:04 crc kubenswrapper[4937]: I0225 16:20:04.663060 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb7j5\" (UniqueName: \"kubernetes.io/projected/e7c4e56b-a36f-48a7-b4d7-b962a074d740-kube-api-access-fb7j5\") on node \"crc\" DevicePath \"\"" Feb 25 16:20:05 crc kubenswrapper[4937]: I0225 16:20:05.029708 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533940-84f7g" event={"ID":"e7c4e56b-a36f-48a7-b4d7-b962a074d740","Type":"ContainerDied","Data":"5ad95c6469de20bbc4d28d15fa2fc1aa56f4d960b62b76763db42d9880f9f21f"} Feb 25 16:20:05 crc kubenswrapper[4937]: I0225 16:20:05.029752 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ad95c6469de20bbc4d28d15fa2fc1aa56f4d960b62b76763db42d9880f9f21f" Feb 25 16:20:05 crc kubenswrapper[4937]: I0225 16:20:05.029765 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533940-84f7g" Feb 25 16:20:05 crc kubenswrapper[4937]: I0225 16:20:05.527320 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533934-9znz5"] Feb 25 16:20:05 crc kubenswrapper[4937]: I0225 16:20:05.540979 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533934-9znz5"] Feb 25 16:20:07 crc kubenswrapper[4937]: I0225 16:20:07.380411 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b10bad36-94f5-475c-9e2d-c99fcac85f6a" path="/var/lib/kubelet/pods/b10bad36-94f5-475c-9e2d-c99fcac85f6a/volumes" Feb 25 16:20:33 crc kubenswrapper[4937]: I0225 16:20:33.358861 4937 generic.go:334] "Generic (PLEG): container finished" podID="165310e1-208b-4a29-a8fd-be630d60fc08" containerID="2e4023dd73822cf45e9756b0b9ee873c654101eae60c672e5da5b01097ea3f19" exitCode=0 Feb 25 16:20:33 crc kubenswrapper[4937]: I0225 16:20:33.358934 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" event={"ID":"165310e1-208b-4a29-a8fd-be630d60fc08","Type":"ContainerDied","Data":"2e4023dd73822cf45e9756b0b9ee873c654101eae60c672e5da5b01097ea3f19"} Feb 25 16:20:34 crc kubenswrapper[4937]: I0225 16:20:34.880370 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.010916 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-ssh-key-openstack-edpm-ipam\") pod \"165310e1-208b-4a29-a8fd-be630d60fc08\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.011080 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-inventory\") pod \"165310e1-208b-4a29-a8fd-be630d60fc08\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.011198 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdnwr\" (UniqueName: \"kubernetes.io/projected/165310e1-208b-4a29-a8fd-be630d60fc08-kube-api-access-rdnwr\") pod \"165310e1-208b-4a29-a8fd-be630d60fc08\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.011360 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-bootstrap-combined-ca-bundle\") pod \"165310e1-208b-4a29-a8fd-be630d60fc08\" (UID: \"165310e1-208b-4a29-a8fd-be630d60fc08\") " Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.017089 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/165310e1-208b-4a29-a8fd-be630d60fc08-kube-api-access-rdnwr" (OuterVolumeSpecName: "kube-api-access-rdnwr") pod "165310e1-208b-4a29-a8fd-be630d60fc08" (UID: "165310e1-208b-4a29-a8fd-be630d60fc08"). InnerVolumeSpecName "kube-api-access-rdnwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.019256 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "165310e1-208b-4a29-a8fd-be630d60fc08" (UID: "165310e1-208b-4a29-a8fd-be630d60fc08"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.055872 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-inventory" (OuterVolumeSpecName: "inventory") pod "165310e1-208b-4a29-a8fd-be630d60fc08" (UID: "165310e1-208b-4a29-a8fd-be630d60fc08"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.073556 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "165310e1-208b-4a29-a8fd-be630d60fc08" (UID: "165310e1-208b-4a29-a8fd-be630d60fc08"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.115534 4937 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.115947 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.115968 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/165310e1-208b-4a29-a8fd-be630d60fc08-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.116028 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdnwr\" (UniqueName: \"kubernetes.io/projected/165310e1-208b-4a29-a8fd-be630d60fc08-kube-api-access-rdnwr\") on node \"crc\" DevicePath \"\"" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.386936 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.398954 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc" event={"ID":"165310e1-208b-4a29-a8fd-be630d60fc08","Type":"ContainerDied","Data":"eaf67f8055e2ff7523d6c8b943dc2ed17d4338ac50b944a29ec0b0bd81649536"} Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.399016 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaf67f8055e2ff7523d6c8b943dc2ed17d4338ac50b944a29ec0b0bd81649536" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.516242 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w"] Feb 25 16:20:35 crc kubenswrapper[4937]: E0225 16:20:35.521171 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c4e56b-a36f-48a7-b4d7-b962a074d740" containerName="oc" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.521216 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c4e56b-a36f-48a7-b4d7-b962a074d740" containerName="oc" Feb 25 16:20:35 crc kubenswrapper[4937]: E0225 16:20:35.521298 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="165310e1-208b-4a29-a8fd-be630d60fc08" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.521312 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="165310e1-208b-4a29-a8fd-be630d60fc08" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.521697 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c4e56b-a36f-48a7-b4d7-b962a074d740" containerName="oc" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.521732 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="165310e1-208b-4a29-a8fd-be630d60fc08" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.522940 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.527084 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w"] Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.528350 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.528558 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.528656 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.528833 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.651239 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfpb2\" (UniqueName: \"kubernetes.io/projected/e34d42d5-94de-45fe-b002-65da3cd1d49d-kube-api-access-pfpb2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-29n8w\" (UID: \"e34d42d5-94de-45fe-b002-65da3cd1d49d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.651524 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e34d42d5-94de-45fe-b002-65da3cd1d49d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-29n8w\" (UID: \"e34d42d5-94de-45fe-b002-65da3cd1d49d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.651600 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e34d42d5-94de-45fe-b002-65da3cd1d49d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-29n8w\" (UID: \"e34d42d5-94de-45fe-b002-65da3cd1d49d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.753381 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e34d42d5-94de-45fe-b002-65da3cd1d49d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-29n8w\" (UID: \"e34d42d5-94de-45fe-b002-65da3cd1d49d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.753472 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e34d42d5-94de-45fe-b002-65da3cd1d49d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-29n8w\" (UID: \"e34d42d5-94de-45fe-b002-65da3cd1d49d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.753545 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfpb2\" (UniqueName: 
\"kubernetes.io/projected/e34d42d5-94de-45fe-b002-65da3cd1d49d-kube-api-access-pfpb2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-29n8w\" (UID: \"e34d42d5-94de-45fe-b002-65da3cd1d49d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.771252 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e34d42d5-94de-45fe-b002-65da3cd1d49d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-29n8w\" (UID: \"e34d42d5-94de-45fe-b002-65da3cd1d49d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.773306 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e34d42d5-94de-45fe-b002-65da3cd1d49d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-29n8w\" (UID: \"e34d42d5-94de-45fe-b002-65da3cd1d49d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.774035 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfpb2\" (UniqueName: \"kubernetes.io/projected/e34d42d5-94de-45fe-b002-65da3cd1d49d-kube-api-access-pfpb2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-29n8w\" (UID: \"e34d42d5-94de-45fe-b002-65da3cd1d49d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" Feb 25 16:20:35 crc kubenswrapper[4937]: I0225 16:20:35.845736 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" Feb 25 16:20:36 crc kubenswrapper[4937]: I0225 16:20:36.385554 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w"] Feb 25 16:20:37 crc kubenswrapper[4937]: I0225 16:20:37.414984 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" event={"ID":"e34d42d5-94de-45fe-b002-65da3cd1d49d","Type":"ContainerStarted","Data":"a6ae96af4fca6c5b8c6fd5329143b98858202373717cfb774b9c89184a5d9aa2"} Feb 25 16:20:37 crc kubenswrapper[4937]: I0225 16:20:37.415755 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" event={"ID":"e34d42d5-94de-45fe-b002-65da3cd1d49d","Type":"ContainerStarted","Data":"bac7a84e7f18baf9972e26b861f03aef7b3c1f1ce001b98e4df4ae5cddd73ffd"} Feb 25 16:20:37 crc kubenswrapper[4937]: I0225 16:20:37.435071 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" podStartSLOduration=1.976551612 podStartE2EDuration="2.435050256s" podCreationTimestamp="2026-02-25 16:20:35 +0000 UTC" firstStartedPulling="2026-02-25 16:20:36.394113005 +0000 UTC m=+2087.407504895" lastFinishedPulling="2026-02-25 16:20:36.852611609 +0000 UTC m=+2087.866003539" observedRunningTime="2026-02-25 16:20:37.43444004 +0000 UTC m=+2088.447831970" watchObservedRunningTime="2026-02-25 16:20:37.435050256 +0000 UTC m=+2088.448442156" Feb 25 16:20:38 crc kubenswrapper[4937]: I0225 16:20:38.116778 4937 scope.go:117] "RemoveContainer" containerID="bddb6db6b71410649176b95603e990a6c0ec09b56b153e0fe57e3a53f5a7b3ef" Feb 25 16:21:10 crc 
kubenswrapper[4937]: I0225 16:21:10.059242 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-zbnk5"] Feb 25 16:21:10 crc kubenswrapper[4937]: I0225 16:21:10.077854 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1246-account-create-update-c8jvn"] Feb 25 16:21:10 crc kubenswrapper[4937]: I0225 16:21:10.100893 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-zbnk5"] Feb 25 16:21:10 crc kubenswrapper[4937]: I0225 16:21:10.112552 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1246-account-create-update-c8jvn"] Feb 25 16:21:11 crc kubenswrapper[4937]: I0225 16:21:11.047055 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-g8745"] Feb 25 16:21:11 crc kubenswrapper[4937]: I0225 16:21:11.057034 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-g8745"] Feb 25 16:21:11 crc kubenswrapper[4937]: I0225 16:21:11.391044 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2530391c-1cbc-4c0c-ab27-bba9cfcc5149" path="/var/lib/kubelet/pods/2530391c-1cbc-4c0c-ab27-bba9cfcc5149/volumes" Feb 25 16:21:11 crc kubenswrapper[4937]: I0225 16:21:11.393038 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3" path="/var/lib/kubelet/pods/c04fbb9c-6249-4f8a-b67b-4bcb9418bfd3/volumes" Feb 25 16:21:11 crc kubenswrapper[4937]: I0225 16:21:11.394303 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c163075d-e1f7-4252-92aa-17b9bbfe336a" path="/var/lib/kubelet/pods/c163075d-e1f7-4252-92aa-17b9bbfe336a/volumes" Feb 25 16:21:11 crc kubenswrapper[4937]: I0225 16:21:11.494698 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:21:11 crc kubenswrapper[4937]: I0225 16:21:11.494780 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:21:12 crc kubenswrapper[4937]: I0225 16:21:12.041764 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-8bftw"] Feb 25 16:21:12 crc kubenswrapper[4937]: I0225 16:21:12.055161 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5d21-account-create-update-4j4wt"] Feb 25 16:21:12 crc kubenswrapper[4937]: I0225 16:21:12.067520 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c72d-account-create-update-hrdcp"] Feb 25 16:21:12 crc kubenswrapper[4937]: I0225 16:21:12.080175 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-8bftw"] Feb 25 16:21:12 crc kubenswrapper[4937]: I0225 16:21:12.092710 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c72d-account-create-update-hrdcp"] Feb 25 16:21:12 crc kubenswrapper[4937]: I0225 16:21:12.105151 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5d21-account-create-update-4j4wt"] Feb 25 16:21:13 crc kubenswrapper[4937]: I0225 
16:21:13.384971 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05e337d3-8903-40f9-84d2-bb1e0e7d4629" path="/var/lib/kubelet/pods/05e337d3-8903-40f9-84d2-bb1e0e7d4629/volumes" Feb 25 16:21:13 crc kubenswrapper[4937]: I0225 16:21:13.386706 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2269b3b-bbaf-44bc-a77c-059195d12b86" path="/var/lib/kubelet/pods/c2269b3b-bbaf-44bc-a77c-059195d12b86/volumes" Feb 25 16:21:13 crc kubenswrapper[4937]: I0225 16:21:13.388056 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdbe96df-78a6-449e-affb-3529fdc05d49" path="/var/lib/kubelet/pods/fdbe96df-78a6-449e-affb-3529fdc05d49/volumes" Feb 25 16:21:33 crc kubenswrapper[4937]: I0225 16:21:33.052054 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-cj6fx"] Feb 25 16:21:33 crc kubenswrapper[4937]: I0225 16:21:33.067255 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-cj6fx"] Feb 25 16:21:33 crc kubenswrapper[4937]: I0225 16:21:33.381480 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2827afe8-442b-4aa9-95f6-48ef3c9a3995" path="/var/lib/kubelet/pods/2827afe8-442b-4aa9-95f6-48ef3c9a3995/volumes" Feb 25 16:21:38 crc kubenswrapper[4937]: I0225 16:21:38.244995 4937 scope.go:117] "RemoveContainer" containerID="171df702add08fe3c36371361badbad1544eaf0ec157059c87ceb8ac24fc2729" Feb 25 16:21:38 crc kubenswrapper[4937]: I0225 16:21:38.290733 4937 scope.go:117] "RemoveContainer" containerID="75a9b7ba6a480607e978fc6e8fb11bd9b61f37f954d1a73eb683ecd0a4a46fd4" Feb 25 16:21:38 crc kubenswrapper[4937]: I0225 16:21:38.336887 4937 scope.go:117] "RemoveContainer" containerID="4d42b95fc6b1e699d92332668e701f0555d27ce7f429fabb5e19e2bedb7254fc" Feb 25 16:21:38 crc kubenswrapper[4937]: I0225 16:21:38.387385 4937 scope.go:117] "RemoveContainer" containerID="5a4ecea5e96581497771457c324bd235ddd5897d43e16f46c92a3ed611831fdd" Feb 25 16:21:38 crc kubenswrapper[4937]: I0225 16:21:38.432290 4937 scope.go:117] "RemoveContainer" containerID="237083e3c1d59c300baa7e892d31a3d9cfcae08915b0c62639d276547f4ca9e0" Feb 25 16:21:38 crc kubenswrapper[4937]: I0225 16:21:38.489244 4937 scope.go:117] "RemoveContainer" containerID="4fce1db9ef8aed180141acc2e44adc14eee51286f48564b0d2fdf56c0b65e530" Feb 25 16:21:38 crc kubenswrapper[4937]: I0225 16:21:38.546007 4937 scope.go:117] "RemoveContainer" containerID="ee001256fc234aa33337e8a6b6fb7c4dbb4f3469fb2eb31fab62ba5c262dd7bf" Feb 25 16:21:38 crc kubenswrapper[4937]: I0225 16:21:38.571459 4937 scope.go:117] "RemoveContainer" containerID="19c82d8b77bd9eee7e468b03627e81ea09173f9635cd3b5c970210c79edd727d" Feb 25 16:21:38 crc kubenswrapper[4937]: I0225 16:21:38.597147 4937 scope.go:117] "RemoveContainer" containerID="29f9cdb7519e86fe9ba15e10efbd9aeddfd04710c8a4ea64818905fd87e80ea2" Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.056813 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f8fc-account-create-update-drksr"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.078480 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-10df-account-create-update-zbrbt"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.101868 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-10df-account-create-update-zbrbt"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.120614 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-f8fc-account-create-update-drksr"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.130183 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-gfsht"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.139678 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1efe-account-create-update-wh56p"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.149151 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-9cgvr"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.158429 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1efe-account-create-update-wh56p"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.167660 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-a261-account-create-update-9j4jx"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.176446 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-9cgvr"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.184943 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-gfsht"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.193073 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-a261-account-create-update-9j4jx"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.201604 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-zg54r"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.209880 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-zg54r"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.217920 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gbxlp"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.225428 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gbxlp"] Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.380258 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231f1566-c91c-47f6-9ef5-5a9fbc5b0c57" path="/var/lib/kubelet/pods/231f1566-c91c-47f6-9ef5-5a9fbc5b0c57/volumes" Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.381527 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="296328ce-10bb-42d4-a3e6-3b4986e9b944" path="/var/lib/kubelet/pods/296328ce-10bb-42d4-a3e6-3b4986e9b944/volumes" Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.382071 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f262d7a-313f-4cdc-8a9a-5a5765fb3da0" path="/var/lib/kubelet/pods/2f262d7a-313f-4cdc-8a9a-5a5765fb3da0/volumes" Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.382630 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d3da10-44f5-48ce-8279-6565217f5ab2" path="/var/lib/kubelet/pods/31d3da10-44f5-48ce-8279-6565217f5ab2/volumes" Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.383147 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c517a3-e534-4a32-aaa1-23ad2d42fc10" path="/var/lib/kubelet/pods/a1c517a3-e534-4a32-aaa1-23ad2d42fc10/volumes" Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.383671 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1373796-a25a-406b-a417-a26ff42bbce4" 
path="/var/lib/kubelet/pods/c1373796-a25a-406b-a417-a26ff42bbce4/volumes" Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.384192 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca05068-9163-4e5f-abc8-c98462b3b6c8" path="/var/lib/kubelet/pods/cca05068-9163-4e5f-abc8-c98462b3b6c8/volumes" Feb 25 16:21:39 crc kubenswrapper[4937]: I0225 16:21:39.384747 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fceee1fa-35a1-4b5d-aca8-054ab1816927" path="/var/lib/kubelet/pods/fceee1fa-35a1-4b5d-aca8-054ab1816927/volumes" Feb 25 16:21:41 crc kubenswrapper[4937]: I0225 16:21:41.495992 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:21:41 crc kubenswrapper[4937]: I0225 16:21:41.496231 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:21:45 crc kubenswrapper[4937]: I0225 16:21:45.062432 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-4nvj6"] Feb 25 16:21:45 crc kubenswrapper[4937]: I0225 16:21:45.074526 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-w8hgr"] Feb 25 16:21:45 crc kubenswrapper[4937]: I0225 16:21:45.088960 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-4nvj6"] Feb 25 16:21:45 crc kubenswrapper[4937]: I0225 16:21:45.098738 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-w8hgr"] Feb 25 16:21:45 crc kubenswrapper[4937]: I0225 16:21:45.383629 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88c34e87-5116-4387-9e2b-e5fbcedb6f55" path="/var/lib/kubelet/pods/88c34e87-5116-4387-9e2b-e5fbcedb6f55/volumes" Feb 25 16:21:45 crc kubenswrapper[4937]: I0225 16:21:45.384506 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d" path="/var/lib/kubelet/pods/8d70e0ad-27d8-4998-988f-1b5e2c7b8a7d/volumes" Feb 25 16:22:00 crc kubenswrapper[4937]: I0225 16:22:00.161072 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533942-kqhw5"] Feb 25 16:22:00 crc kubenswrapper[4937]: I0225 16:22:00.164242 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533942-kqhw5" Feb 25 16:22:00 crc kubenswrapper[4937]: I0225 16:22:00.166279 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:22:00 crc kubenswrapper[4937]: I0225 16:22:00.169279 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:22:00 crc kubenswrapper[4937]: I0225 16:22:00.169880 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:22:00 crc kubenswrapper[4937]: I0225 16:22:00.174287 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533942-kqhw5"] Feb 25 16:22:00 crc kubenswrapper[4937]: I0225 16:22:00.310337 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpvmg\" (UniqueName: \"kubernetes.io/projected/c7fbac47-0089-4e3d-99c6-33efb288a426-kube-api-access-kpvmg\") pod \"auto-csr-approver-29533942-kqhw5\" (UID: \"c7fbac47-0089-4e3d-99c6-33efb288a426\") " pod="openshift-infra/auto-csr-approver-29533942-kqhw5" Feb 25 16:22:00 crc kubenswrapper[4937]: I0225 16:22:00.412437 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpvmg\" (UniqueName: \"kubernetes.io/projected/c7fbac47-0089-4e3d-99c6-33efb288a426-kube-api-access-kpvmg\") pod \"auto-csr-approver-29533942-kqhw5\" (UID: \"c7fbac47-0089-4e3d-99c6-33efb288a426\") " pod="openshift-infra/auto-csr-approver-29533942-kqhw5" Feb 25 16:22:00 crc kubenswrapper[4937]: I0225 16:22:00.446260 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpvmg\" (UniqueName: \"kubernetes.io/projected/c7fbac47-0089-4e3d-99c6-33efb288a426-kube-api-access-kpvmg\") pod \"auto-csr-approver-29533942-kqhw5\" (UID: \"c7fbac47-0089-4e3d-99c6-33efb288a426\") " pod="openshift-infra/auto-csr-approver-29533942-kqhw5" Feb 25 16:22:00 crc kubenswrapper[4937]: I0225 16:22:00.492435 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533942-kqhw5" Feb 25 16:22:02 crc kubenswrapper[4937]: W0225 16:22:02.449233 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7fbac47_0089_4e3d_99c6_33efb288a426.slice/crio-d05e97f81f412b4665202a0b3722a681f82082d9169975d572e0b7a6304c1e35 WatchSource:0}: Error finding container d05e97f81f412b4665202a0b3722a681f82082d9169975d572e0b7a6304c1e35: Status 404 returned error can't find the container with id d05e97f81f412b4665202a0b3722a681f82082d9169975d572e0b7a6304c1e35 Feb 25 16:22:02 crc kubenswrapper[4937]: I0225 16:22:02.449394 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533942-kqhw5"] Feb 25 16:22:03 crc kubenswrapper[4937]: I0225 16:22:03.405226 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533942-kqhw5" event={"ID":"c7fbac47-0089-4e3d-99c6-33efb288a426","Type":"ContainerStarted","Data":"d05e97f81f412b4665202a0b3722a681f82082d9169975d572e0b7a6304c1e35"} Feb 25 16:22:05 crc kubenswrapper[4937]: I0225 16:22:05.426985 4937 generic.go:334] "Generic (PLEG): container finished" podID="c7fbac47-0089-4e3d-99c6-33efb288a426" containerID="620fcd4241f5384fb706d1ba9a49c4ada7f8eb59dbd0647d795ac1d8db850525" exitCode=0 Feb 25 16:22:05 crc kubenswrapper[4937]: I0225 16:22:05.427053 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533942-kqhw5" event={"ID":"c7fbac47-0089-4e3d-99c6-33efb288a426","Type":"ContainerDied","Data":"620fcd4241f5384fb706d1ba9a49c4ada7f8eb59dbd0647d795ac1d8db850525"} Feb 25 16:22:06 crc kubenswrapper[4937]: I0225 16:22:06.850127 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533942-kqhw5" Feb 25 16:22:06 crc kubenswrapper[4937]: I0225 16:22:06.946803 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpvmg\" (UniqueName: \"kubernetes.io/projected/c7fbac47-0089-4e3d-99c6-33efb288a426-kube-api-access-kpvmg\") pod \"c7fbac47-0089-4e3d-99c6-33efb288a426\" (UID: \"c7fbac47-0089-4e3d-99c6-33efb288a426\") " Feb 25 16:22:06 crc kubenswrapper[4937]: I0225 16:22:06.953883 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7fbac47-0089-4e3d-99c6-33efb288a426-kube-api-access-kpvmg" (OuterVolumeSpecName: "kube-api-access-kpvmg") pod "c7fbac47-0089-4e3d-99c6-33efb288a426" (UID: "c7fbac47-0089-4e3d-99c6-33efb288a426"). InnerVolumeSpecName "kube-api-access-kpvmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:22:07 crc kubenswrapper[4937]: I0225 16:22:07.049754 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpvmg\" (UniqueName: \"kubernetes.io/projected/c7fbac47-0089-4e3d-99c6-33efb288a426-kube-api-access-kpvmg\") on node \"crc\" DevicePath \"\"" Feb 25 16:22:07 crc kubenswrapper[4937]: I0225 16:22:07.446285 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533942-kqhw5" event={"ID":"c7fbac47-0089-4e3d-99c6-33efb288a426","Type":"ContainerDied","Data":"d05e97f81f412b4665202a0b3722a681f82082d9169975d572e0b7a6304c1e35"} Feb 25 16:22:07 crc kubenswrapper[4937]: I0225 16:22:07.446331 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d05e97f81f412b4665202a0b3722a681f82082d9169975d572e0b7a6304c1e35" Feb 25 16:22:07 crc kubenswrapper[4937]: I0225 16:22:07.446340 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533942-kqhw5" Feb 25 16:22:07 crc kubenswrapper[4937]: I0225 16:22:07.916258 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533936-7gx2j"] Feb 25 16:22:07 crc kubenswrapper[4937]: I0225 16:22:07.927670 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533936-7gx2j"] Feb 25 16:22:09 crc kubenswrapper[4937]: I0225 16:22:09.379464 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3f7da7-06ca-4408-a2ff-c890385edcf0" path="/var/lib/kubelet/pods/3e3f7da7-06ca-4408-a2ff-c890385edcf0/volumes" Feb 25 16:22:11 crc kubenswrapper[4937]: I0225 16:22:11.495024 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:22:11 crc kubenswrapper[4937]: I0225 16:22:11.495512 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:22:11 crc kubenswrapper[4937]: I0225 16:22:11.495588 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 16:22:11 crc kubenswrapper[4937]: I0225 16:22:11.496871 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a1902f50a856ad1bac31a5abd173e354d882341ff395f33b062b6ec2ed08e38e"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 16:22:11 crc kubenswrapper[4937]: I0225 16:22:11.496995 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://a1902f50a856ad1bac31a5abd173e354d882341ff395f33b062b6ec2ed08e38e" gracePeriod=600 Feb 25 16:22:12 crc kubenswrapper[4937]: I0225 16:22:12.498282 4937 generic.go:334] "Generic 
(PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="a1902f50a856ad1bac31a5abd173e354d882341ff395f33b062b6ec2ed08e38e" exitCode=0 Feb 25 16:22:12 crc kubenswrapper[4937]: I0225 16:22:12.498359 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"a1902f50a856ad1bac31a5abd173e354d882341ff395f33b062b6ec2ed08e38e"} Feb 25 16:22:12 crc kubenswrapper[4937]: I0225 16:22:12.498691 4937 scope.go:117] "RemoveContainer" containerID="afa6295ddd2d73c55c3228c83e5c7749598a0b0d21741df0466ba5796213382e" Feb 25 16:22:13 crc kubenswrapper[4937]: I0225 16:22:13.513416 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3"} Feb 25 16:22:29 crc kubenswrapper[4937]: I0225 16:22:29.699982 4937 generic.go:334] "Generic (PLEG): container finished" podID="e34d42d5-94de-45fe-b002-65da3cd1d49d" containerID="a6ae96af4fca6c5b8c6fd5329143b98858202373717cfb774b9c89184a5d9aa2" exitCode=0 Feb 25 16:22:29 crc kubenswrapper[4937]: I0225 16:22:29.700034 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" event={"ID":"e34d42d5-94de-45fe-b002-65da3cd1d49d","Type":"ContainerDied","Data":"a6ae96af4fca6c5b8c6fd5329143b98858202373717cfb774b9c89184a5d9aa2"} Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.309500 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.398415 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e34d42d5-94de-45fe-b002-65da3cd1d49d-ssh-key-openstack-edpm-ipam\") pod \"e34d42d5-94de-45fe-b002-65da3cd1d49d\" (UID: \"e34d42d5-94de-45fe-b002-65da3cd1d49d\") " Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.398913 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e34d42d5-94de-45fe-b002-65da3cd1d49d-inventory\") pod \"e34d42d5-94de-45fe-b002-65da3cd1d49d\" (UID: \"e34d42d5-94de-45fe-b002-65da3cd1d49d\") " Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.399174 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfpb2\" (UniqueName: \"kubernetes.io/projected/e34d42d5-94de-45fe-b002-65da3cd1d49d-kube-api-access-pfpb2\") pod \"e34d42d5-94de-45fe-b002-65da3cd1d49d\" (UID: \"e34d42d5-94de-45fe-b002-65da3cd1d49d\") " Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.405954 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e34d42d5-94de-45fe-b002-65da3cd1d49d-kube-api-access-pfpb2" (OuterVolumeSpecName: "kube-api-access-pfpb2") pod "e34d42d5-94de-45fe-b002-65da3cd1d49d" (UID: "e34d42d5-94de-45fe-b002-65da3cd1d49d"). InnerVolumeSpecName "kube-api-access-pfpb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.476904 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e34d42d5-94de-45fe-b002-65da3cd1d49d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e34d42d5-94de-45fe-b002-65da3cd1d49d" (UID: "e34d42d5-94de-45fe-b002-65da3cd1d49d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.477848 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e34d42d5-94de-45fe-b002-65da3cd1d49d-inventory" (OuterVolumeSpecName: "inventory") pod "e34d42d5-94de-45fe-b002-65da3cd1d49d" (UID: "e34d42d5-94de-45fe-b002-65da3cd1d49d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.502679 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e34d42d5-94de-45fe-b002-65da3cd1d49d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.502712 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e34d42d5-94de-45fe-b002-65da3cd1d49d-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.502724 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfpb2\" (UniqueName: \"kubernetes.io/projected/e34d42d5-94de-45fe-b002-65da3cd1d49d-kube-api-access-pfpb2\") on node \"crc\" DevicePath \"\"" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.720058 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" event={"ID":"e34d42d5-94de-45fe-b002-65da3cd1d49d","Type":"ContainerDied","Data":"bac7a84e7f18baf9972e26b861f03aef7b3c1f1ce001b98e4df4ae5cddd73ffd"} Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.720102 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bac7a84e7f18baf9972e26b861f03aef7b3c1f1ce001b98e4df4ae5cddd73ffd" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.720167 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-29n8w" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.833846 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54"] Feb 25 16:22:31 crc kubenswrapper[4937]: E0225 16:22:31.834385 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e34d42d5-94de-45fe-b002-65da3cd1d49d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.834409 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34d42d5-94de-45fe-b002-65da3cd1d49d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 25 16:22:31 crc kubenswrapper[4937]: E0225 16:22:31.834469 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7fbac47-0089-4e3d-99c6-33efb288a426" containerName="oc" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.834479 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7fbac47-0089-4e3d-99c6-33efb288a426" containerName="oc" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.834743 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7fbac47-0089-4e3d-99c6-33efb288a426" containerName="oc" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.834774 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e34d42d5-94de-45fe-b002-65da3cd1d49d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.835766 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.837652 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.838226 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.838459 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.839786 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.852741 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54"] Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.912661 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dfsk\" (UniqueName: \"kubernetes.io/projected/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-kube-api-access-4dfsk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h5j54\" (UID: \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.912741 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h5j54\" (UID: 
\"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" Feb 25 16:22:31 crc kubenswrapper[4937]: I0225 16:22:31.913421 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h5j54\" (UID: \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" Feb 25 16:22:32 crc kubenswrapper[4937]: I0225 16:22:32.015779 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h5j54\" (UID: \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" Feb 25 16:22:32 crc kubenswrapper[4937]: I0225 16:22:32.016016 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dfsk\" (UniqueName: \"kubernetes.io/projected/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-kube-api-access-4dfsk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h5j54\" (UID: \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" Feb 25 16:22:32 crc kubenswrapper[4937]: I0225 16:22:32.016069 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h5j54\" (UID: \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" Feb 25 16:22:32 crc kubenswrapper[4937]: I0225 16:22:32.021093 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h5j54\" (UID: \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" Feb 25 16:22:32 crc kubenswrapper[4937]: I0225 16:22:32.021614 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h5j54\" (UID: \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" Feb 25 16:22:32 crc kubenswrapper[4937]: I0225 16:22:32.040759 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dfsk\" (UniqueName: \"kubernetes.io/projected/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-kube-api-access-4dfsk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h5j54\" (UID: \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" Feb 25 16:22:32 crc kubenswrapper[4937]: I0225 16:22:32.172996 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" Feb 25 16:22:32 crc kubenswrapper[4937]: I0225 16:22:32.748935 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54"] Feb 25 16:22:32 crc kubenswrapper[4937]: W0225 16:22:32.750694 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0caaa2f_df02_4bb7_a490_f3333d6c47a2.slice/crio-c0e596e27f7c33bde3204159f99a046c538abfa4829996b924d7d2fcc33efde0 WatchSource:0}: Error finding container c0e596e27f7c33bde3204159f99a046c538abfa4829996b924d7d2fcc33efde0: Status 404 returned error can't find the container with id c0e596e27f7c33bde3204159f99a046c538abfa4829996b924d7d2fcc33efde0 Feb 25 16:22:32 crc kubenswrapper[4937]: I0225 16:22:32.753865 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 16:22:33 crc kubenswrapper[4937]: I0225 16:22:33.768314 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" event={"ID":"d0caaa2f-df02-4bb7-a490-f3333d6c47a2","Type":"ContainerStarted","Data":"c0e596e27f7c33bde3204159f99a046c538abfa4829996b924d7d2fcc33efde0"} Feb 25 16:22:34 crc kubenswrapper[4937]: I0225 16:22:34.054323 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-4vmm4"] Feb 25 16:22:34 crc kubenswrapper[4937]: I0225 16:22:34.069378 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-4vmm4"] Feb 25 16:22:34 crc kubenswrapper[4937]: I0225 16:22:34.781532 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" event={"ID":"d0caaa2f-df02-4bb7-a490-f3333d6c47a2","Type":"ContainerStarted","Data":"6a70ca2f9d04d20f678b207d9f3a622e77cfee8c20d554a0a748dd264bc18f51"} Feb 25 16:22:34 crc kubenswrapper[4937]: I0225 16:22:34.808448 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" podStartSLOduration=2.5640197970000003 podStartE2EDuration="3.80842007s" podCreationTimestamp="2026-02-25 16:22:31 +0000 UTC" firstStartedPulling="2026-02-25 16:22:32.753533183 +0000 UTC m=+2203.766925073" lastFinishedPulling="2026-02-25 16:22:33.997933446 +0000 UTC m=+2205.011325346" observedRunningTime="2026-02-25 16:22:34.799316402 +0000 UTC m=+2205.812708302" watchObservedRunningTime="2026-02-25 16:22:34.80842007 +0000 UTC m=+2205.821811970" Feb 25 16:22:35 crc kubenswrapper[4937]: I0225 16:22:35.380575 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90764917-3dc9-4778-b224-67cb4ae1e49d" path="/var/lib/kubelet/pods/90764917-3dc9-4778-b224-67cb4ae1e49d/volumes" Feb 25 16:22:38 crc kubenswrapper[4937]: I0225 16:22:38.811950 4937 scope.go:117] "RemoveContainer" containerID="5f67fb26717060327cc02c8f7a5125a3d6874d9200c9e2c5153551570e7d7ff0" Feb 25 16:22:38 crc kubenswrapper[4937]: I0225 16:22:38.849564 4937 scope.go:117] "RemoveContainer" containerID="d28f74bba9cf2132a8c1737596d2cb91106e993882316795fd11a1578d272172" Feb 25 16:22:38 crc kubenswrapper[4937]: I0225 16:22:38.899431 4937 scope.go:117] "RemoveContainer" containerID="c11b6b2b3762e7a7abc350a999b42beb55931df1a2229a4f9d7409f744605c7c" Feb 25 16:22:38 crc kubenswrapper[4937]: I0225 16:22:38.976082 4937 scope.go:117] 
"RemoveContainer" containerID="bb3ff9c2adb409ba539fa1cedc7bac2749456d98d0aea1cc56ea7135c55ab4b8" Feb 25 16:22:38 crc kubenswrapper[4937]: I0225 16:22:38.995836 4937 scope.go:117] "RemoveContainer" containerID="274b9ce56fc7b77d6878d871391c4f2e4371c69ba071de597b85a964120f89ac" Feb 25 16:22:39 crc kubenswrapper[4937]: I0225 16:22:39.070019 4937 scope.go:117] "RemoveContainer" containerID="c3cf326623410f2d9aca12fc12f29fcb1221a82e4948c138eb02590abd9c8891" Feb 25 16:22:39 crc kubenswrapper[4937]: I0225 16:22:39.124853 4937 scope.go:117] "RemoveContainer" containerID="cb23e5ae864b685389756b3396da69b0ea65114200286df72ea4a90a4291c56c" Feb 25 16:22:39 crc kubenswrapper[4937]: I0225 16:22:39.163056 4937 scope.go:117] "RemoveContainer" containerID="f828440b21ea18ae8263a8d3f11bc51cf5f99ec4284f45e31327cfe60dba52ee" Feb 25 16:22:39 crc kubenswrapper[4937]: I0225 16:22:39.350036 4937 scope.go:117] "RemoveContainer" containerID="a2478a6237d1252b5ffeaeaefc5c4093115de93e2dfa46d9ab0e1cae15835b1a" Feb 25 16:22:39 crc kubenswrapper[4937]: I0225 16:22:39.377833 4937 scope.go:117] "RemoveContainer" containerID="f20c055947b7a96675aa34ee94834d7d0eb316b3207ca30e8ed95725a2eac4c8" Feb 25 16:22:39 crc kubenswrapper[4937]: I0225 16:22:39.409711 4937 scope.go:117] "RemoveContainer" containerID="04d2954c76170b54b664e31e54d5a6f8eeaa54b13b5ee8c72c544d7f49d75bda" Feb 25 16:22:39 crc kubenswrapper[4937]: I0225 16:22:39.457306 4937 scope.go:117] "RemoveContainer" containerID="bac0b4953d10367c1e1de1365b214139835f7a2eb291f9aad46dfc9a9da07968" Feb 25 16:22:47 crc kubenswrapper[4937]: I0225 16:22:47.052655 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8tmw5"] Feb 25 16:22:47 crc kubenswrapper[4937]: I0225 16:22:47.061291 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8tmw5"] Feb 25 16:22:47 crc kubenswrapper[4937]: I0225 16:22:47.402534 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f988c32-d57e-4e63-add5-1e86a8818641" path="/var/lib/kubelet/pods/8f988c32-d57e-4e63-add5-1e86a8818641/volumes" Feb 25 16:22:48 crc kubenswrapper[4937]: I0225 16:22:48.031175 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ldjsj"] Feb 25 16:22:48 crc kubenswrapper[4937]: I0225 16:22:48.045650 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ldjsj"] Feb 25 16:22:49 crc kubenswrapper[4937]: I0225 16:22:49.380287 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69b3388f-2762-4ebe-a014-e1740aee3b66" path="/var/lib/kubelet/pods/69b3388f-2762-4ebe-a014-e1740aee3b66/volumes" Feb 25 16:23:00 crc kubenswrapper[4937]: I0225 16:23:00.040072 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-7w4sz"] Feb 25 16:23:00 crc kubenswrapper[4937]: I0225 16:23:00.056185 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-7w4sz"] Feb 25 16:23:01 crc kubenswrapper[4937]: I0225 16:23:01.390203 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a537ec-7743-44bd-b428-fa52adf39305" path="/var/lib/kubelet/pods/38a537ec-7743-44bd-b428-fa52adf39305/volumes" Feb 25 16:23:05 crc kubenswrapper[4937]: I0225 16:23:05.045878 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6rpk2"] Feb 25 16:23:05 crc kubenswrapper[4937]: I0225 16:23:05.059264 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6rpk2"] Feb 25 
16:23:05 crc kubenswrapper[4937]: I0225 16:23:05.378172 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="006fb5e7-a244-4758-8065-3615f5a2b9b7" path="/var/lib/kubelet/pods/006fb5e7-a244-4758-8065-3615f5a2b9b7/volumes" Feb 25 16:23:36 crc kubenswrapper[4937]: I0225 16:23:36.046018 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-ck6bq"] Feb 25 16:23:36 crc kubenswrapper[4937]: I0225 16:23:36.057817 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-ck6bq"] Feb 25 16:23:37 crc kubenswrapper[4937]: I0225 16:23:37.034460 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-996c-account-create-update-dqbhk"] Feb 25 16:23:37 crc kubenswrapper[4937]: I0225 16:23:37.048014 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-dd47-account-create-update-m7rq7"] Feb 25 16:23:37 crc kubenswrapper[4937]: I0225 16:23:37.061365 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-dd47-account-create-update-m7rq7"] Feb 25 16:23:37 crc kubenswrapper[4937]: I0225 16:23:37.072396 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-996c-account-create-update-dqbhk"] Feb 25 16:23:37 crc kubenswrapper[4937]: I0225 16:23:37.385668 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b4ba23-a361-40f5-9025-a80203afb802" path="/var/lib/kubelet/pods/30b4ba23-a361-40f5-9025-a80203afb802/volumes" Feb 25 16:23:37 crc kubenswrapper[4937]: I0225 16:23:37.387206 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70916290-1479-4642-b8c3-cd571d51ba42" path="/var/lib/kubelet/pods/70916290-1479-4642-b8c3-cd571d51ba42/volumes" Feb 25 16:23:37 crc kubenswrapper[4937]: I0225 16:23:37.388685 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc7039f2-d86f-4915-92de-5eab8c16f281" path="/var/lib/kubelet/pods/dc7039f2-d86f-4915-92de-5eab8c16f281/volumes" Feb 25 16:23:38 crc kubenswrapper[4937]: I0225 16:23:38.032588 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-vhl9c"] Feb 25 16:23:38 crc kubenswrapper[4937]: I0225 16:23:38.043584 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-gp8qg"] Feb 25 16:23:38 crc kubenswrapper[4937]: I0225 16:23:38.055504 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c25c-account-create-update-pjkhj"] Feb 25 16:23:38 crc kubenswrapper[4937]: I0225 16:23:38.063185 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-vhl9c"] Feb 25 16:23:38 crc kubenswrapper[4937]: I0225 16:23:38.072431 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-gp8qg"] Feb 25 16:23:38 crc kubenswrapper[4937]: I0225 16:23:38.085284 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c25c-account-create-update-pjkhj"] Feb 25 16:23:39 crc kubenswrapper[4937]: I0225 16:23:39.378746 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f5f5588-4ffb-43f5-a891-0a61f46ab7af" path="/var/lib/kubelet/pods/0f5f5588-4ffb-43f5-a891-0a61f46ab7af/volumes" Feb 25 16:23:39 crc kubenswrapper[4937]: I0225 16:23:39.379337 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d324c2c-8a6d-431b-92d0-b735158fd9fa" path="/var/lib/kubelet/pods/3d324c2c-8a6d-431b-92d0-b735158fd9fa/volumes" Feb 25 16:23:39 crc 
kubenswrapper[4937]: I0225 16:23:39.379871 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aca08b6d-8b50-4471-8ae5-0c3b517ef2b3" path="/var/lib/kubelet/pods/aca08b6d-8b50-4471-8ae5-0c3b517ef2b3/volumes" Feb 25 16:23:39 crc kubenswrapper[4937]: I0225 16:23:39.900434 4937 scope.go:117] "RemoveContainer" containerID="17fe8c26e8a7d51a9c951e2df23ea4327813d2108f24fe58b6c7693b73ddab73" Feb 25 16:23:39 crc kubenswrapper[4937]: I0225 16:23:39.928970 4937 scope.go:117] "RemoveContainer" containerID="39c1d35910d9965962a2c3133843aac7413ceb0c3af56828f4d782dae8ca4270" Feb 25 16:23:40 crc kubenswrapper[4937]: I0225 16:23:40.022060 4937 scope.go:117] "RemoveContainer" containerID="bac64cbf62bbf5c9cb7526af1128d38eef5d873e3a5f17b5965cad8e809efe3c" Feb 25 16:23:40 crc kubenswrapper[4937]: I0225 16:23:40.072595 4937 scope.go:117] "RemoveContainer" containerID="35f2454e80988e6d3382331eb99b5326ab08f19cd000e35acdf3a10e7b3e808b" Feb 25 16:23:40 crc kubenswrapper[4937]: I0225 16:23:40.123222 4937 scope.go:117] "RemoveContainer" containerID="df8cb711efe73603c82bd5bba6a3b14feecdc2e93753b56ee494be611bfbbc78" Feb 25 16:23:40 crc kubenswrapper[4937]: I0225 16:23:40.171131 4937 scope.go:117] "RemoveContainer" containerID="24d96139ad175c0781598244dc1d9972d8a382730deb3a8b23a5a63248c6f156" Feb 25 16:23:40 crc kubenswrapper[4937]: I0225 16:23:40.227882 4937 scope.go:117] "RemoveContainer" containerID="db1e85c0e17d1bfd820f6c49914c2c6689c8bc86c7d87db309f01c55431a9af9" Feb 25 16:23:40 crc kubenswrapper[4937]: I0225 16:23:40.252088 4937 scope.go:117] "RemoveContainer" containerID="e1a6a643635b7ded6f035d06b0c8511691bcf92f830f74575dc22a413a153f75" Feb 25 16:23:40 crc kubenswrapper[4937]: I0225 16:23:40.306267 4937 scope.go:117] "RemoveContainer" containerID="77432d56416122942e2324d15dfdcdb0a8a8399a29c6d11a876e4b1f973dbf7d" Feb 25 16:23:40 crc kubenswrapper[4937]: I0225 16:23:40.331686 4937 scope.go:117] "RemoveContainer" containerID="f395a5b1e304c3bb44e2b84ed50e0c9ed2ac844e1a470dc98f6a712d508cda10" Feb 25 16:23:45 crc kubenswrapper[4937]: I0225 16:23:45.001589 4937 generic.go:334] "Generic (PLEG): container finished" podID="d0caaa2f-df02-4bb7-a490-f3333d6c47a2" containerID="6a70ca2f9d04d20f678b207d9f3a622e77cfee8c20d554a0a748dd264bc18f51" exitCode=0 Feb 25 16:23:45 crc kubenswrapper[4937]: I0225 16:23:45.001799 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" event={"ID":"d0caaa2f-df02-4bb7-a490-f3333d6c47a2","Type":"ContainerDied","Data":"6a70ca2f9d04d20f678b207d9f3a622e77cfee8c20d554a0a748dd264bc18f51"} Feb 25 16:23:46 crc kubenswrapper[4937]: I0225 16:23:46.611285 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" Feb 25 16:23:46 crc kubenswrapper[4937]: I0225 16:23:46.752191 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-inventory\") pod \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\" (UID: \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\") " Feb 25 16:23:46 crc kubenswrapper[4937]: I0225 16:23:46.752234 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-ssh-key-openstack-edpm-ipam\") pod \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\" (UID: \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\") " Feb 25 16:23:46 crc kubenswrapper[4937]: I0225 16:23:46.752298 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dfsk\" (UniqueName: \"kubernetes.io/projected/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-kube-api-access-4dfsk\") pod \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\" (UID: \"d0caaa2f-df02-4bb7-a490-f3333d6c47a2\") " Feb 25 16:23:46 crc kubenswrapper[4937]: I0225 16:23:46.757949 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-kube-api-access-4dfsk" (OuterVolumeSpecName: "kube-api-access-4dfsk") pod "d0caaa2f-df02-4bb7-a490-f3333d6c47a2" (UID: "d0caaa2f-df02-4bb7-a490-f3333d6c47a2"). InnerVolumeSpecName "kube-api-access-4dfsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:23:46 crc kubenswrapper[4937]: I0225 16:23:46.778872 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d0caaa2f-df02-4bb7-a490-f3333d6c47a2" (UID: "d0caaa2f-df02-4bb7-a490-f3333d6c47a2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:23:46 crc kubenswrapper[4937]: I0225 16:23:46.787766 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-inventory" (OuterVolumeSpecName: "inventory") pod "d0caaa2f-df02-4bb7-a490-f3333d6c47a2" (UID: "d0caaa2f-df02-4bb7-a490-f3333d6c47a2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:23:46 crc kubenswrapper[4937]: I0225 16:23:46.854609 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:23:46 crc kubenswrapper[4937]: I0225 16:23:46.854645 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:23:46 crc kubenswrapper[4937]: I0225 16:23:46.854661 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dfsk\" (UniqueName: \"kubernetes.io/projected/d0caaa2f-df02-4bb7-a490-f3333d6c47a2-kube-api-access-4dfsk\") on node \"crc\" DevicePath \"\"" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.020024 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" event={"ID":"d0caaa2f-df02-4bb7-a490-f3333d6c47a2","Type":"ContainerDied","Data":"c0e596e27f7c33bde3204159f99a046c538abfa4829996b924d7d2fcc33efde0"} Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.020068 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0e596e27f7c33bde3204159f99a046c538abfa4829996b924d7d2fcc33efde0" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.020074 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h5j54" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.115841 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t"] Feb 25 16:23:47 crc kubenswrapper[4937]: E0225 16:23:47.116274 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0caaa2f-df02-4bb7-a490-f3333d6c47a2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.116295 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0caaa2f-df02-4bb7-a490-f3333d6c47a2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.116527 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0caaa2f-df02-4bb7-a490-f3333d6c47a2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.117253 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.121236 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.121236 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.121862 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.125403 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.131213 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t"] Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.261139 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t\" (UID: \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.261385 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52xdp\" (UniqueName: \"kubernetes.io/projected/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-kube-api-access-52xdp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t\" (UID: \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.261470 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t\" (UID: \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.363575 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52xdp\" (UniqueName: \"kubernetes.io/projected/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-kube-api-access-52xdp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t\" (UID: \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.363929 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t\" (UID: \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.363958 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t\" (UID: \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.368178 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t\" (UID: \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.369304 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t\" (UID: \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.380312 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52xdp\" (UniqueName: \"kubernetes.io/projected/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-kube-api-access-52xdp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t\" (UID: \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" Feb 25 16:23:47 crc kubenswrapper[4937]: I0225 16:23:47.436434 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" Feb 25 16:23:48 crc kubenswrapper[4937]: I0225 16:23:48.037028 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t"] Feb 25 16:23:49 crc kubenswrapper[4937]: I0225 16:23:49.040473 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" event={"ID":"cceb45e3-0685-45fb-b7c3-cf18ccb0649b","Type":"ContainerStarted","Data":"4861144cf9454fb62cfc70fa6d3e6c7f83ab4dff0c59f6803bcaa7b6b6d8a28b"} Feb 25 16:23:49 crc kubenswrapper[4937]: I0225 16:23:49.040840 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" event={"ID":"cceb45e3-0685-45fb-b7c3-cf18ccb0649b","Type":"ContainerStarted","Data":"758b30217964fc0852e2b10aaab8de5b195e625684f9b27df2ccae30108ebbbd"} Feb 25 16:23:49 crc kubenswrapper[4937]: I0225 16:23:49.057453 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" podStartSLOduration=1.518089937 podStartE2EDuration="2.057432419s" podCreationTimestamp="2026-02-25 16:23:47 +0000 UTC" firstStartedPulling="2026-02-25 16:23:48.040877492 +0000 UTC m=+2279.054269382" lastFinishedPulling="2026-02-25 16:23:48.580219974 +0000 UTC m=+2279.593611864" observedRunningTime="2026-02-25 16:23:49.05546482 +0000 UTC m=+2280.068856720" watchObservedRunningTime="2026-02-25 16:23:49.057432419 +0000 UTC m=+2280.070824309" Feb 25 16:23:54 crc kubenswrapper[4937]: I0225 16:23:54.086613 4937 generic.go:334] "Generic (PLEG): container finished" podID="cceb45e3-0685-45fb-b7c3-cf18ccb0649b" 
containerID="4861144cf9454fb62cfc70fa6d3e6c7f83ab4dff0c59f6803bcaa7b6b6d8a28b" exitCode=0 Feb 25 16:23:54 crc kubenswrapper[4937]: I0225 16:23:54.086682 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" event={"ID":"cceb45e3-0685-45fb-b7c3-cf18ccb0649b","Type":"ContainerDied","Data":"4861144cf9454fb62cfc70fa6d3e6c7f83ab4dff0c59f6803bcaa7b6b6d8a28b"} Feb 25 16:23:55 crc kubenswrapper[4937]: I0225 16:23:55.626360 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" Feb 25 16:23:55 crc kubenswrapper[4937]: I0225 16:23:55.768031 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-ssh-key-openstack-edpm-ipam\") pod \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\" (UID: \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\") " Feb 25 16:23:55 crc kubenswrapper[4937]: I0225 16:23:55.768087 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52xdp\" (UniqueName: \"kubernetes.io/projected/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-kube-api-access-52xdp\") pod \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\" (UID: \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\") " Feb 25 16:23:55 crc kubenswrapper[4937]: I0225 16:23:55.768142 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-inventory\") pod \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\" (UID: \"cceb45e3-0685-45fb-b7c3-cf18ccb0649b\") " Feb 25 16:23:55 crc kubenswrapper[4937]: I0225 16:23:55.779562 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-kube-api-access-52xdp" (OuterVolumeSpecName: "kube-api-access-52xdp") pod "cceb45e3-0685-45fb-b7c3-cf18ccb0649b" (UID: "cceb45e3-0685-45fb-b7c3-cf18ccb0649b"). InnerVolumeSpecName "kube-api-access-52xdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:23:55 crc kubenswrapper[4937]: I0225 16:23:55.802815 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cceb45e3-0685-45fb-b7c3-cf18ccb0649b" (UID: "cceb45e3-0685-45fb-b7c3-cf18ccb0649b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:23:55 crc kubenswrapper[4937]: I0225 16:23:55.809999 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-inventory" (OuterVolumeSpecName: "inventory") pod "cceb45e3-0685-45fb-b7c3-cf18ccb0649b" (UID: "cceb45e3-0685-45fb-b7c3-cf18ccb0649b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:23:55 crc kubenswrapper[4937]: I0225 16:23:55.870903 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:23:55 crc kubenswrapper[4937]: I0225 16:23:55.870933 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52xdp\" (UniqueName: \"kubernetes.io/projected/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-kube-api-access-52xdp\") on node \"crc\" DevicePath \"\"" Feb 25 16:23:55 crc kubenswrapper[4937]: I0225 16:23:55.870943 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cceb45e3-0685-45fb-b7c3-cf18ccb0649b-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.109308 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" event={"ID":"cceb45e3-0685-45fb-b7c3-cf18ccb0649b","Type":"ContainerDied","Data":"758b30217964fc0852e2b10aaab8de5b195e625684f9b27df2ccae30108ebbbd"} Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.109346 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.109351 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="758b30217964fc0852e2b10aaab8de5b195e625684f9b27df2ccae30108ebbbd" Feb 25 16:23:56 crc kubenswrapper[4937]: E0225 16:23:56.194927 4937 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcceb45e3_0685_45fb_b7c3_cf18ccb0649b.slice\": RecentStats: unable to find data in memory cache]" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.203147 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn"] Feb 25 16:23:56 crc kubenswrapper[4937]: E0225 16:23:56.203652 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cceb45e3-0685-45fb-b7c3-cf18ccb0649b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.203674 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="cceb45e3-0685-45fb-b7c3-cf18ccb0649b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.205098 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="cceb45e3-0685-45fb-b7c3-cf18ccb0649b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.221772 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.245341 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn"] Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.260201 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.260266 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.260437 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.260612 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.380233 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4784f56a-332c-45b1-b121-ec925aece823-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2mgkn\" (UID: \"4784f56a-332c-45b1-b121-ec925aece823\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.380322 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c7tg\" (UniqueName: \"kubernetes.io/projected/4784f56a-332c-45b1-b121-ec925aece823-kube-api-access-8c7tg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2mgkn\" (UID: \"4784f56a-332c-45b1-b121-ec925aece823\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.380473 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4784f56a-332c-45b1-b121-ec925aece823-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2mgkn\" (UID: \"4784f56a-332c-45b1-b121-ec925aece823\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.482829 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4784f56a-332c-45b1-b121-ec925aece823-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2mgkn\" (UID: \"4784f56a-332c-45b1-b121-ec925aece823\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.482915 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4784f56a-332c-45b1-b121-ec925aece823-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2mgkn\" (UID: \"4784f56a-332c-45b1-b121-ec925aece823\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.483003 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c7tg\" (UniqueName: \"kubernetes.io/projected/4784f56a-332c-45b1-b121-ec925aece823-kube-api-access-8c7tg\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-2mgkn\" (UID: \"4784f56a-332c-45b1-b121-ec925aece823\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.487846 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4784f56a-332c-45b1-b121-ec925aece823-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2mgkn\" (UID: \"4784f56a-332c-45b1-b121-ec925aece823\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.496466 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4784f56a-332c-45b1-b121-ec925aece823-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2mgkn\" (UID: \"4784f56a-332c-45b1-b121-ec925aece823\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.500795 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c7tg\" (UniqueName: \"kubernetes.io/projected/4784f56a-332c-45b1-b121-ec925aece823-kube-api-access-8c7tg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2mgkn\" (UID: \"4784f56a-332c-45b1-b121-ec925aece823\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" Feb 25 16:23:56 crc kubenswrapper[4937]: I0225 16:23:56.587453 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" Feb 25 16:23:57 crc kubenswrapper[4937]: I0225 16:23:57.193950 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn"] Feb 25 16:23:58 crc kubenswrapper[4937]: I0225 16:23:58.131397 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" event={"ID":"4784f56a-332c-45b1-b121-ec925aece823","Type":"ContainerStarted","Data":"1af04a4db165569b03a0092791ba8cbe9d09081dd690e83cf66c80c68c9efcc0"} Feb 25 16:23:58 crc kubenswrapper[4937]: I0225 16:23:58.131733 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" event={"ID":"4784f56a-332c-45b1-b121-ec925aece823","Type":"ContainerStarted","Data":"008f7047b22a285830e99644120f56cec58885abd30538604df6f3a02c36e0a3"} Feb 25 16:23:58 crc kubenswrapper[4937]: I0225 16:23:58.148922 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" podStartSLOduration=1.714927817 podStartE2EDuration="2.148900019s" podCreationTimestamp="2026-02-25 16:23:56 +0000 UTC" firstStartedPulling="2026-02-25 16:23:57.190871719 +0000 UTC m=+2288.204263629" lastFinishedPulling="2026-02-25 16:23:57.624843941 +0000 UTC m=+2288.638235831" observedRunningTime="2026-02-25 16:23:58.145970156 +0000 UTC m=+2289.159362046" watchObservedRunningTime="2026-02-25 16:23:58.148900019 +0000 UTC m=+2289.162291919" Feb 25 16:24:00 crc kubenswrapper[4937]: I0225 16:24:00.138272 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533944-z7jph"] Feb 25 16:24:00 crc kubenswrapper[4937]: I0225 16:24:00.140059 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533944-z7jph" Feb 25 16:24:00 crc kubenswrapper[4937]: I0225 16:24:00.181438 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:24:00 crc kubenswrapper[4937]: I0225 16:24:00.181963 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:24:00 crc kubenswrapper[4937]: I0225 16:24:00.182111 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:24:00 crc kubenswrapper[4937]: I0225 16:24:00.197109 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533944-z7jph"] Feb 25 16:24:00 crc kubenswrapper[4937]: I0225 16:24:00.282028 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5bhm\" (UniqueName: \"kubernetes.io/projected/0cdd70e5-860d-4b0c-9f36-5727ac62a4ba-kube-api-access-s5bhm\") pod \"auto-csr-approver-29533944-z7jph\" (UID: \"0cdd70e5-860d-4b0c-9f36-5727ac62a4ba\") " pod="openshift-infra/auto-csr-approver-29533944-z7jph" Feb 25 16:24:00 crc kubenswrapper[4937]: I0225 16:24:00.383787 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5bhm\" (UniqueName: \"kubernetes.io/projected/0cdd70e5-860d-4b0c-9f36-5727ac62a4ba-kube-api-access-s5bhm\") pod \"auto-csr-approver-29533944-z7jph\" (UID: \"0cdd70e5-860d-4b0c-9f36-5727ac62a4ba\") " pod="openshift-infra/auto-csr-approver-29533944-z7jph" Feb 25 16:24:00 crc kubenswrapper[4937]: I0225 16:24:00.415472 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5bhm\" (UniqueName: \"kubernetes.io/projected/0cdd70e5-860d-4b0c-9f36-5727ac62a4ba-kube-api-access-s5bhm\") pod \"auto-csr-approver-29533944-z7jph\" (UID: \"0cdd70e5-860d-4b0c-9f36-5727ac62a4ba\") " pod="openshift-infra/auto-csr-approver-29533944-z7jph" Feb 25 16:24:00 crc kubenswrapper[4937]: I0225 16:24:00.504775 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533944-z7jph" Feb 25 16:24:01 crc kubenswrapper[4937]: W0225 16:24:01.009978 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cdd70e5_860d_4b0c_9f36_5727ac62a4ba.slice/crio-2a17d08035477d550aa86e10458a9cb33bad60a88d5b63567634d98cbb11554a WatchSource:0}: Error finding container 2a17d08035477d550aa86e10458a9cb33bad60a88d5b63567634d98cbb11554a: Status 404 returned error can't find the container with id 2a17d08035477d550aa86e10458a9cb33bad60a88d5b63567634d98cbb11554a Feb 25 16:24:01 crc kubenswrapper[4937]: I0225 16:24:01.010403 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533944-z7jph"] Feb 25 16:24:01 crc kubenswrapper[4937]: I0225 16:24:01.201641 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533944-z7jph" event={"ID":"0cdd70e5-860d-4b0c-9f36-5727ac62a4ba","Type":"ContainerStarted","Data":"2a17d08035477d550aa86e10458a9cb33bad60a88d5b63567634d98cbb11554a"} Feb 25 16:24:03 crc kubenswrapper[4937]: I0225 16:24:03.225819 4937 generic.go:334] "Generic (PLEG): container finished" podID="0cdd70e5-860d-4b0c-9f36-5727ac62a4ba" containerID="2d6b61db3d154e32ceab1a3129ef56d78a5745b9e13feb2efffbcc615a14978d" exitCode=0 Feb 25 16:24:03 crc kubenswrapper[4937]: I0225 16:24:03.225945 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533944-z7jph" event={"ID":"0cdd70e5-860d-4b0c-9f36-5727ac62a4ba","Type":"ContainerDied","Data":"2d6b61db3d154e32ceab1a3129ef56d78a5745b9e13feb2efffbcc615a14978d"} Feb 25 16:24:04 crc kubenswrapper[4937]: I0225 16:24:04.689472 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533944-z7jph" Feb 25 16:24:04 crc kubenswrapper[4937]: I0225 16:24:04.881326 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5bhm\" (UniqueName: \"kubernetes.io/projected/0cdd70e5-860d-4b0c-9f36-5727ac62a4ba-kube-api-access-s5bhm\") pod \"0cdd70e5-860d-4b0c-9f36-5727ac62a4ba\" (UID: \"0cdd70e5-860d-4b0c-9f36-5727ac62a4ba\") " Feb 25 16:24:04 crc kubenswrapper[4937]: I0225 16:24:04.887353 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cdd70e5-860d-4b0c-9f36-5727ac62a4ba-kube-api-access-s5bhm" (OuterVolumeSpecName: "kube-api-access-s5bhm") pod "0cdd70e5-860d-4b0c-9f36-5727ac62a4ba" (UID: "0cdd70e5-860d-4b0c-9f36-5727ac62a4ba"). InnerVolumeSpecName "kube-api-access-s5bhm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:24:04 crc kubenswrapper[4937]: I0225 16:24:04.983590 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5bhm\" (UniqueName: \"kubernetes.io/projected/0cdd70e5-860d-4b0c-9f36-5727ac62a4ba-kube-api-access-s5bhm\") on node \"crc\" DevicePath \"\"" Feb 25 16:24:05 crc kubenswrapper[4937]: I0225 16:24:05.244716 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533944-z7jph" event={"ID":"0cdd70e5-860d-4b0c-9f36-5727ac62a4ba","Type":"ContainerDied","Data":"2a17d08035477d550aa86e10458a9cb33bad60a88d5b63567634d98cbb11554a"} Feb 25 16:24:05 crc kubenswrapper[4937]: I0225 16:24:05.244757 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a17d08035477d550aa86e10458a9cb33bad60a88d5b63567634d98cbb11554a" Feb 25 16:24:05 crc kubenswrapper[4937]: I0225 16:24:05.244765 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533944-z7jph" Feb 25 16:24:05 crc kubenswrapper[4937]: I0225 16:24:05.758562 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533938-bhrf7"] Feb 25 16:24:05 crc kubenswrapper[4937]: I0225 16:24:05.769353 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533938-bhrf7"] Feb 25 16:24:07 crc kubenswrapper[4937]: I0225 16:24:07.381261 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="947fb65f-cfe3-411c-9ebd-4f89480703e0" path="/var/lib/kubelet/pods/947fb65f-cfe3-411c-9ebd-4f89480703e0/volumes" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.642975 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d56nr"] Feb 25 16:24:11 crc kubenswrapper[4937]: E0225 16:24:11.644319 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cdd70e5-860d-4b0c-9f36-5727ac62a4ba" containerName="oc" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.644341 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cdd70e5-860d-4b0c-9f36-5727ac62a4ba" containerName="oc" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.644826 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cdd70e5-860d-4b0c-9f36-5727ac62a4ba" containerName="oc" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.647764 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.652773 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d56nr"] Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.745209 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm9v2\" (UniqueName: \"kubernetes.io/projected/1ac265df-bc4f-4df2-b70b-6f300681009b-kube-api-access-hm9v2\") pod \"redhat-operators-d56nr\" (UID: \"1ac265df-bc4f-4df2-b70b-6f300681009b\") " pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.745496 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ac265df-bc4f-4df2-b70b-6f300681009b-utilities\") pod \"redhat-operators-d56nr\" (UID: \"1ac265df-bc4f-4df2-b70b-6f300681009b\") " pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.745639 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ac265df-bc4f-4df2-b70b-6f300681009b-catalog-content\") pod \"redhat-operators-d56nr\" (UID: \"1ac265df-bc4f-4df2-b70b-6f300681009b\") " pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.847556 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ac265df-bc4f-4df2-b70b-6f300681009b-catalog-content\") pod \"redhat-operators-d56nr\" (UID: \"1ac265df-bc4f-4df2-b70b-6f300681009b\") " pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.847678 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm9v2\" (UniqueName: \"kubernetes.io/projected/1ac265df-bc4f-4df2-b70b-6f300681009b-kube-api-access-hm9v2\") pod \"redhat-operators-d56nr\" (UID: \"1ac265df-bc4f-4df2-b70b-6f300681009b\") " pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.847700 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ac265df-bc4f-4df2-b70b-6f300681009b-utilities\") pod \"redhat-operators-d56nr\" (UID: \"1ac265df-bc4f-4df2-b70b-6f300681009b\") " pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.848184 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ac265df-bc4f-4df2-b70b-6f300681009b-utilities\") pod \"redhat-operators-d56nr\" (UID: \"1ac265df-bc4f-4df2-b70b-6f300681009b\") " pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.848369 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ac265df-bc4f-4df2-b70b-6f300681009b-catalog-content\") pod \"redhat-operators-d56nr\" (UID: \"1ac265df-bc4f-4df2-b70b-6f300681009b\") " pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.871860 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hm9v2\" (UniqueName: \"kubernetes.io/projected/1ac265df-bc4f-4df2-b70b-6f300681009b-kube-api-access-hm9v2\") pod \"redhat-operators-d56nr\" (UID: \"1ac265df-bc4f-4df2-b70b-6f300681009b\") " pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:11 crc kubenswrapper[4937]: I0225 16:24:11.982278 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:12 crc kubenswrapper[4937]: I0225 16:24:12.498281 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d56nr"] Feb 25 16:24:12 crc kubenswrapper[4937]: W0225 16:24:12.507546 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ac265df_bc4f_4df2_b70b_6f300681009b.slice/crio-b7602f77783cf08ce1c080886c8c1646aed6b304249a2198dbf88be05eb218fc WatchSource:0}: Error finding container b7602f77783cf08ce1c080886c8c1646aed6b304249a2198dbf88be05eb218fc: Status 404 returned error can't find the container with id b7602f77783cf08ce1c080886c8c1646aed6b304249a2198dbf88be05eb218fc Feb 25 16:24:13 crc kubenswrapper[4937]: I0225 16:24:13.331631 4937 generic.go:334] "Generic (PLEG): container finished" podID="1ac265df-bc4f-4df2-b70b-6f300681009b" containerID="3a0912f9706625a067b0134ce01a072e3fbec9f5d0e6a13f7f1f180df8b09441" exitCode=0 Feb 25 16:24:13 crc kubenswrapper[4937]: I0225 16:24:13.331740 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d56nr" event={"ID":"1ac265df-bc4f-4df2-b70b-6f300681009b","Type":"ContainerDied","Data":"3a0912f9706625a067b0134ce01a072e3fbec9f5d0e6a13f7f1f180df8b09441"} Feb 25 16:24:13 crc kubenswrapper[4937]: I0225 16:24:13.332171 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d56nr" event={"ID":"1ac265df-bc4f-4df2-b70b-6f300681009b","Type":"ContainerStarted","Data":"b7602f77783cf08ce1c080886c8c1646aed6b304249a2198dbf88be05eb218fc"} Feb 25 16:24:14 crc kubenswrapper[4937]: I0225 16:24:14.342589 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d56nr" event={"ID":"1ac265df-bc4f-4df2-b70b-6f300681009b","Type":"ContainerStarted","Data":"0a33bd7fcbf9e7eba4638ac40874f67fb6f53b53d4fd63f720a065e450f8cc15"} Feb 25 16:24:19 crc kubenswrapper[4937]: I0225 16:24:19.040395 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ln9pv"] Feb 25 16:24:19 crc kubenswrapper[4937]: I0225 16:24:19.052991 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ln9pv"] Feb 25 16:24:19 crc kubenswrapper[4937]: I0225 16:24:19.383950 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90c2ee96-d6a2-4231-abc4-e9e186375ede" path="/var/lib/kubelet/pods/90c2ee96-d6a2-4231-abc4-e9e186375ede/volumes" Feb 25 16:24:22 crc kubenswrapper[4937]: I0225 16:24:22.422036 4937 generic.go:334] "Generic (PLEG): container finished" podID="1ac265df-bc4f-4df2-b70b-6f300681009b" containerID="0a33bd7fcbf9e7eba4638ac40874f67fb6f53b53d4fd63f720a065e450f8cc15" exitCode=0 Feb 25 16:24:22 crc kubenswrapper[4937]: I0225 16:24:22.422116 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d56nr" event={"ID":"1ac265df-bc4f-4df2-b70b-6f300681009b","Type":"ContainerDied","Data":"0a33bd7fcbf9e7eba4638ac40874f67fb6f53b53d4fd63f720a065e450f8cc15"} Feb 25 16:24:23 
crc kubenswrapper[4937]: I0225 16:24:23.446941 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d56nr" event={"ID":"1ac265df-bc4f-4df2-b70b-6f300681009b","Type":"ContainerStarted","Data":"6b991fe9c69bca31eb5eb26107ec28f03d52b2feda52a09fea0171ca5ac53ce7"} Feb 25 16:24:23 crc kubenswrapper[4937]: I0225 16:24:23.475465 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d56nr" podStartSLOduration=2.971855957 podStartE2EDuration="12.475447841s" podCreationTimestamp="2026-02-25 16:24:11 +0000 UTC" firstStartedPulling="2026-02-25 16:24:13.333811162 +0000 UTC m=+2304.347203052" lastFinishedPulling="2026-02-25 16:24:22.837403046 +0000 UTC m=+2313.850794936" observedRunningTime="2026-02-25 16:24:23.467074651 +0000 UTC m=+2314.480466541" watchObservedRunningTime="2026-02-25 16:24:23.475447841 +0000 UTC m=+2314.488839731" Feb 25 16:24:31 crc kubenswrapper[4937]: I0225 16:24:31.982827 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:31 crc kubenswrapper[4937]: I0225 16:24:31.983415 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:32 crc kubenswrapper[4937]: I0225 16:24:32.040778 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:32 crc kubenswrapper[4937]: I0225 16:24:32.558694 4937 generic.go:334] "Generic (PLEG): container finished" podID="4784f56a-332c-45b1-b121-ec925aece823" containerID="1af04a4db165569b03a0092791ba8cbe9d09081dd690e83cf66c80c68c9efcc0" exitCode=0 Feb 25 16:24:32 crc kubenswrapper[4937]: I0225 16:24:32.558762 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" event={"ID":"4784f56a-332c-45b1-b121-ec925aece823","Type":"ContainerDied","Data":"1af04a4db165569b03a0092791ba8cbe9d09081dd690e83cf66c80c68c9efcc0"} Feb 25 16:24:32 crc kubenswrapper[4937]: I0225 16:24:32.612675 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:32 crc kubenswrapper[4937]: I0225 16:24:32.662686 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d56nr"] Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.088509 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.214633 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c7tg\" (UniqueName: \"kubernetes.io/projected/4784f56a-332c-45b1-b121-ec925aece823-kube-api-access-8c7tg\") pod \"4784f56a-332c-45b1-b121-ec925aece823\" (UID: \"4784f56a-332c-45b1-b121-ec925aece823\") " Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.214748 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4784f56a-332c-45b1-b121-ec925aece823-ssh-key-openstack-edpm-ipam\") pod \"4784f56a-332c-45b1-b121-ec925aece823\" (UID: \"4784f56a-332c-45b1-b121-ec925aece823\") " Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.214785 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4784f56a-332c-45b1-b121-ec925aece823-inventory\") pod \"4784f56a-332c-45b1-b121-ec925aece823\" (UID: \"4784f56a-332c-45b1-b121-ec925aece823\") " Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.220758 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4784f56a-332c-45b1-b121-ec925aece823-kube-api-access-8c7tg" (OuterVolumeSpecName: "kube-api-access-8c7tg") pod "4784f56a-332c-45b1-b121-ec925aece823" (UID: "4784f56a-332c-45b1-b121-ec925aece823"). InnerVolumeSpecName "kube-api-access-8c7tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.247839 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4784f56a-332c-45b1-b121-ec925aece823-inventory" (OuterVolumeSpecName: "inventory") pod "4784f56a-332c-45b1-b121-ec925aece823" (UID: "4784f56a-332c-45b1-b121-ec925aece823"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.249705 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4784f56a-332c-45b1-b121-ec925aece823-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4784f56a-332c-45b1-b121-ec925aece823" (UID: "4784f56a-332c-45b1-b121-ec925aece823"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.317754 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c7tg\" (UniqueName: \"kubernetes.io/projected/4784f56a-332c-45b1-b121-ec925aece823-kube-api-access-8c7tg\") on node \"crc\" DevicePath \"\"" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.317791 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4784f56a-332c-45b1-b121-ec925aece823-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.317803 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4784f56a-332c-45b1-b121-ec925aece823-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.588854 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" event={"ID":"4784f56a-332c-45b1-b121-ec925aece823","Type":"ContainerDied","Data":"008f7047b22a285830e99644120f56cec58885abd30538604df6f3a02c36e0a3"} Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.588907 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="008f7047b22a285830e99644120f56cec58885abd30538604df6f3a02c36e0a3" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.588973 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2mgkn" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.588959 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d56nr" podUID="1ac265df-bc4f-4df2-b70b-6f300681009b" containerName="registry-server" containerID="cri-o://6b991fe9c69bca31eb5eb26107ec28f03d52b2feda52a09fea0171ca5ac53ce7" gracePeriod=2 Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.678556 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp"] Feb 25 16:24:34 crc kubenswrapper[4937]: E0225 16:24:34.679015 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4784f56a-332c-45b1-b121-ec925aece823" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.679032 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4784f56a-332c-45b1-b121-ec925aece823" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.679251 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="4784f56a-332c-45b1-b121-ec925aece823" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.679980 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.682930 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.683260 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.693040 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.693463 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.734272 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp"] Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.831707 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l75b\" (UniqueName: \"kubernetes.io/projected/1bd696e6-be36-4b9e-9f00-9ba293305842-kube-api-access-8l75b\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp\" (UID: \"1bd696e6-be36-4b9e-9f00-9ba293305842\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.832189 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bd696e6-be36-4b9e-9f00-9ba293305842-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp\" (UID: \"1bd696e6-be36-4b9e-9f00-9ba293305842\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.832297 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bd696e6-be36-4b9e-9f00-9ba293305842-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp\" (UID: \"1bd696e6-be36-4b9e-9f00-9ba293305842\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.933753 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bd696e6-be36-4b9e-9f00-9ba293305842-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp\" (UID: \"1bd696e6-be36-4b9e-9f00-9ba293305842\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.933868 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bd696e6-be36-4b9e-9f00-9ba293305842-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp\" (UID: \"1bd696e6-be36-4b9e-9f00-9ba293305842\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.933945 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l75b\" (UniqueName: 
\"kubernetes.io/projected/1bd696e6-be36-4b9e-9f00-9ba293305842-kube-api-access-8l75b\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp\" (UID: \"1bd696e6-be36-4b9e-9f00-9ba293305842\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.942825 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bd696e6-be36-4b9e-9f00-9ba293305842-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp\" (UID: \"1bd696e6-be36-4b9e-9f00-9ba293305842\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.943305 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bd696e6-be36-4b9e-9f00-9ba293305842-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp\" (UID: \"1bd696e6-be36-4b9e-9f00-9ba293305842\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" Feb 25 16:24:34 crc kubenswrapper[4937]: I0225 16:24:34.960364 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l75b\" (UniqueName: \"kubernetes.io/projected/1bd696e6-be36-4b9e-9f00-9ba293305842-kube-api-access-8l75b\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp\" (UID: \"1bd696e6-be36-4b9e-9f00-9ba293305842\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.121782 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.366758 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.451433 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ac265df-bc4f-4df2-b70b-6f300681009b-catalog-content\") pod \"1ac265df-bc4f-4df2-b70b-6f300681009b\" (UID: \"1ac265df-bc4f-4df2-b70b-6f300681009b\") " Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.451700 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm9v2\" (UniqueName: \"kubernetes.io/projected/1ac265df-bc4f-4df2-b70b-6f300681009b-kube-api-access-hm9v2\") pod \"1ac265df-bc4f-4df2-b70b-6f300681009b\" (UID: \"1ac265df-bc4f-4df2-b70b-6f300681009b\") " Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.451910 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ac265df-bc4f-4df2-b70b-6f300681009b-utilities\") pod \"1ac265df-bc4f-4df2-b70b-6f300681009b\" (UID: \"1ac265df-bc4f-4df2-b70b-6f300681009b\") " Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.452759 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ac265df-bc4f-4df2-b70b-6f300681009b-utilities" (OuterVolumeSpecName: "utilities") pod "1ac265df-bc4f-4df2-b70b-6f300681009b" (UID: "1ac265df-bc4f-4df2-b70b-6f300681009b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.453254 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ac265df-bc4f-4df2-b70b-6f300681009b-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.456892 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ac265df-bc4f-4df2-b70b-6f300681009b-kube-api-access-hm9v2" (OuterVolumeSpecName: "kube-api-access-hm9v2") pod "1ac265df-bc4f-4df2-b70b-6f300681009b" (UID: "1ac265df-bc4f-4df2-b70b-6f300681009b"). InnerVolumeSpecName "kube-api-access-hm9v2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.564873 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm9v2\" (UniqueName: \"kubernetes.io/projected/1ac265df-bc4f-4df2-b70b-6f300681009b-kube-api-access-hm9v2\") on node \"crc\" DevicePath \"\"" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.586821 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ac265df-bc4f-4df2-b70b-6f300681009b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ac265df-bc4f-4df2-b70b-6f300681009b" (UID: "1ac265df-bc4f-4df2-b70b-6f300681009b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.609291 4937 generic.go:334] "Generic (PLEG): container finished" podID="1ac265df-bc4f-4df2-b70b-6f300681009b" containerID="6b991fe9c69bca31eb5eb26107ec28f03d52b2feda52a09fea0171ca5ac53ce7" exitCode=0 Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.609329 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d56nr" event={"ID":"1ac265df-bc4f-4df2-b70b-6f300681009b","Type":"ContainerDied","Data":"6b991fe9c69bca31eb5eb26107ec28f03d52b2feda52a09fea0171ca5ac53ce7"} Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.609354 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d56nr" event={"ID":"1ac265df-bc4f-4df2-b70b-6f300681009b","Type":"ContainerDied","Data":"b7602f77783cf08ce1c080886c8c1646aed6b304249a2198dbf88be05eb218fc"} Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.609375 4937 scope.go:117] "RemoveContainer" containerID="6b991fe9c69bca31eb5eb26107ec28f03d52b2feda52a09fea0171ca5ac53ce7" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.609528 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d56nr" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.632815 4937 scope.go:117] "RemoveContainer" containerID="0a33bd7fcbf9e7eba4638ac40874f67fb6f53b53d4fd63f720a065e450f8cc15" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.655805 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d56nr"] Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.664839 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d56nr"] Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.666838 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ac265df-bc4f-4df2-b70b-6f300681009b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.674710 4937 scope.go:117] "RemoveContainer" containerID="3a0912f9706625a067b0134ce01a072e3fbec9f5d0e6a13f7f1f180df8b09441" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.702558 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp"] Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.707130 4937 scope.go:117] "RemoveContainer" containerID="6b991fe9c69bca31eb5eb26107ec28f03d52b2feda52a09fea0171ca5ac53ce7" Feb 25 16:24:35 crc kubenswrapper[4937]: E0225 16:24:35.707530 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b991fe9c69bca31eb5eb26107ec28f03d52b2feda52a09fea0171ca5ac53ce7\": container with ID starting with 6b991fe9c69bca31eb5eb26107ec28f03d52b2feda52a09fea0171ca5ac53ce7 not found: ID does not exist" containerID="6b991fe9c69bca31eb5eb26107ec28f03d52b2feda52a09fea0171ca5ac53ce7" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.707567 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b991fe9c69bca31eb5eb26107ec28f03d52b2feda52a09fea0171ca5ac53ce7"} err="failed to get container status \"6b991fe9c69bca31eb5eb26107ec28f03d52b2feda52a09fea0171ca5ac53ce7\": rpc error: code = NotFound desc = could not find container \"6b991fe9c69bca31eb5eb26107ec28f03d52b2feda52a09fea0171ca5ac53ce7\": container with ID starting with 6b991fe9c69bca31eb5eb26107ec28f03d52b2feda52a09fea0171ca5ac53ce7 not found: ID does not exist" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.707593 4937 scope.go:117] "RemoveContainer" containerID="0a33bd7fcbf9e7eba4638ac40874f67fb6f53b53d4fd63f720a065e450f8cc15" Feb 25 16:24:35 crc kubenswrapper[4937]: E0225 16:24:35.708064 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a33bd7fcbf9e7eba4638ac40874f67fb6f53b53d4fd63f720a065e450f8cc15\": container with ID starting with 0a33bd7fcbf9e7eba4638ac40874f67fb6f53b53d4fd63f720a065e450f8cc15 not found: ID does not exist" containerID="0a33bd7fcbf9e7eba4638ac40874f67fb6f53b53d4fd63f720a065e450f8cc15" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.708104 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a33bd7fcbf9e7eba4638ac40874f67fb6f53b53d4fd63f720a065e450f8cc15"} err="failed to get container status \"0a33bd7fcbf9e7eba4638ac40874f67fb6f53b53d4fd63f720a065e450f8cc15\": rpc error: code = NotFound desc = could not find container 
\"0a33bd7fcbf9e7eba4638ac40874f67fb6f53b53d4fd63f720a065e450f8cc15\": container with ID starting with 0a33bd7fcbf9e7eba4638ac40874f67fb6f53b53d4fd63f720a065e450f8cc15 not found: ID does not exist" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.708130 4937 scope.go:117] "RemoveContainer" containerID="3a0912f9706625a067b0134ce01a072e3fbec9f5d0e6a13f7f1f180df8b09441" Feb 25 16:24:35 crc kubenswrapper[4937]: E0225 16:24:35.708404 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a0912f9706625a067b0134ce01a072e3fbec9f5d0e6a13f7f1f180df8b09441\": container with ID starting with 3a0912f9706625a067b0134ce01a072e3fbec9f5d0e6a13f7f1f180df8b09441 not found: ID does not exist" containerID="3a0912f9706625a067b0134ce01a072e3fbec9f5d0e6a13f7f1f180df8b09441" Feb 25 16:24:35 crc kubenswrapper[4937]: I0225 16:24:35.708439 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a0912f9706625a067b0134ce01a072e3fbec9f5d0e6a13f7f1f180df8b09441"} err="failed to get container status \"3a0912f9706625a067b0134ce01a072e3fbec9f5d0e6a13f7f1f180df8b09441\": rpc error: code = NotFound desc = could not find container \"3a0912f9706625a067b0134ce01a072e3fbec9f5d0e6a13f7f1f180df8b09441\": container with ID starting with 3a0912f9706625a067b0134ce01a072e3fbec9f5d0e6a13f7f1f180df8b09441 not found: ID does not exist" Feb 25 16:24:36 crc kubenswrapper[4937]: I0225 16:24:36.620389 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" event={"ID":"1bd696e6-be36-4b9e-9f00-9ba293305842","Type":"ContainerStarted","Data":"1430019dc9c1f13d949d2b533292410650d5249f7fbf72c7167a385561b8583c"} Feb 25 16:24:36 crc kubenswrapper[4937]: I0225 16:24:36.620773 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" event={"ID":"1bd696e6-be36-4b9e-9f00-9ba293305842","Type":"ContainerStarted","Data":"abc55d99185d868c12d71abfa748b66be13df23001384bbc9145bbee77e60027"} Feb 25 16:24:36 crc kubenswrapper[4937]: I0225 16:24:36.646966 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" podStartSLOduration=2.119769198 podStartE2EDuration="2.646947325s" podCreationTimestamp="2026-02-25 16:24:34 +0000 UTC" firstStartedPulling="2026-02-25 16:24:35.714585787 +0000 UTC m=+2326.727977687" lastFinishedPulling="2026-02-25 16:24:36.241763884 +0000 UTC m=+2327.255155814" observedRunningTime="2026-02-25 16:24:36.63636862 +0000 UTC m=+2327.649760520" watchObservedRunningTime="2026-02-25 16:24:36.646947325 +0000 UTC m=+2327.660339215" Feb 25 16:24:37 crc kubenswrapper[4937]: I0225 16:24:37.380958 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ac265df-bc4f-4df2-b70b-6f300681009b" path="/var/lib/kubelet/pods/1ac265df-bc4f-4df2-b70b-6f300681009b/volumes" Feb 25 16:24:40 crc kubenswrapper[4937]: I0225 16:24:40.564851 4937 scope.go:117] "RemoveContainer" containerID="7f22c4ab562bfd37ac3a88a1518f26c12f915580c0e99b5e1d8cae72324a64de" Feb 25 16:24:40 crc kubenswrapper[4937]: I0225 16:24:40.620942 4937 scope.go:117] "RemoveContainer" containerID="4e633c736d9b0cd964a0d626c3ae05a3f211a5d2b78c41a8d01cc2d9ca52193b" Feb 25 16:24:41 crc kubenswrapper[4937]: I0225 16:24:41.494470 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:24:41 crc kubenswrapper[4937]: I0225 16:24:41.494545 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:24:50 crc kubenswrapper[4937]: I0225 16:24:50.049300 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9ws68"] Feb 25 16:24:50 crc kubenswrapper[4937]: I0225 16:24:50.061817 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9ws68"] Feb 25 16:24:51 crc kubenswrapper[4937]: I0225 16:24:51.382768 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28" path="/var/lib/kubelet/pods/4c6f9ce7-5b2c-43d9-bb6e-683202fbeb28/volumes" Feb 25 16:24:52 crc kubenswrapper[4937]: I0225 16:24:52.043893 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4sssx"] Feb 25 16:24:52 crc kubenswrapper[4937]: I0225 16:24:52.060279 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4sssx"] Feb 25 16:24:53 crc kubenswrapper[4937]: I0225 16:24:53.385245 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a2ea03f-b76a-4775-b4ee-827cc43744c1" path="/var/lib/kubelet/pods/8a2ea03f-b76a-4775-b4ee-827cc43744c1/volumes" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.040647 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-27xfk"] Feb 25 16:25:07 crc kubenswrapper[4937]: E0225 16:25:07.041865 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac265df-bc4f-4df2-b70b-6f300681009b" containerName="extract-content" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.041887 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac265df-bc4f-4df2-b70b-6f300681009b" containerName="extract-content" Feb 25 16:25:07 crc kubenswrapper[4937]: E0225 16:25:07.041913 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac265df-bc4f-4df2-b70b-6f300681009b" containerName="extract-utilities" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.041925 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac265df-bc4f-4df2-b70b-6f300681009b" containerName="extract-utilities" Feb 25 16:25:07 crc kubenswrapper[4937]: E0225 16:25:07.041941 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac265df-bc4f-4df2-b70b-6f300681009b" containerName="registry-server" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.041952 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac265df-bc4f-4df2-b70b-6f300681009b" containerName="registry-server" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.042300 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac265df-bc4f-4df2-b70b-6f300681009b" containerName="registry-server" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.044772 4937 util.go:30] "No sandbox for pod can be found. 
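
The Liveness probe failures above are an HTTP GET against the machine-config-daemon health endpoint that cannot even establish a TCP connection. The standalone Go sketch below reproduces the same kind of check and the same "connection refused" symptom; the 5-second timeout and the treatment of 2xx/3xx responses as success mirror typical kubelet HTTP-probe behaviour but are stated here as assumptions, not as the prober's actual code.

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe performs an HTTP GET like the liveness check above. A transport error
// such as "connection refused", or a status outside 200-399, counts as failure.
func probe(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probe("http://127.0.0.1:8798/health", 5*time.Second); err != nil {
		fmt.Println("Probe failed:", err)
	} else {
		fmt.Println("Probe succeeded")
	}
}
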
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.063215 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27xfk"] Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.151171 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nh8c\" (UniqueName: \"kubernetes.io/projected/cc348e39-415e-461a-816e-478bae8b63a1-kube-api-access-2nh8c\") pod \"redhat-marketplace-27xfk\" (UID: \"cc348e39-415e-461a-816e-478bae8b63a1\") " pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.151355 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc348e39-415e-461a-816e-478bae8b63a1-utilities\") pod \"redhat-marketplace-27xfk\" (UID: \"cc348e39-415e-461a-816e-478bae8b63a1\") " pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.151706 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc348e39-415e-461a-816e-478bae8b63a1-catalog-content\") pod \"redhat-marketplace-27xfk\" (UID: \"cc348e39-415e-461a-816e-478bae8b63a1\") " pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.254310 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nh8c\" (UniqueName: \"kubernetes.io/projected/cc348e39-415e-461a-816e-478bae8b63a1-kube-api-access-2nh8c\") pod \"redhat-marketplace-27xfk\" (UID: \"cc348e39-415e-461a-816e-478bae8b63a1\") " pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.254805 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc348e39-415e-461a-816e-478bae8b63a1-utilities\") pod \"redhat-marketplace-27xfk\" (UID: \"cc348e39-415e-461a-816e-478bae8b63a1\") " pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.254900 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc348e39-415e-461a-816e-478bae8b63a1-catalog-content\") pod \"redhat-marketplace-27xfk\" (UID: \"cc348e39-415e-461a-816e-478bae8b63a1\") " pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.255587 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc348e39-415e-461a-816e-478bae8b63a1-catalog-content\") pod \"redhat-marketplace-27xfk\" (UID: \"cc348e39-415e-461a-816e-478bae8b63a1\") " pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.255718 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc348e39-415e-461a-816e-478bae8b63a1-utilities\") pod \"redhat-marketplace-27xfk\" (UID: \"cc348e39-415e-461a-816e-478bae8b63a1\") " pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.282130 4937 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2nh8c\" (UniqueName: \"kubernetes.io/projected/cc348e39-415e-461a-816e-478bae8b63a1-kube-api-access-2nh8c\") pod \"redhat-marketplace-27xfk\" (UID: \"cc348e39-415e-461a-816e-478bae8b63a1\") " pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.382258 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:07 crc kubenswrapper[4937]: I0225 16:25:07.963950 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27xfk"] Feb 25 16:25:08 crc kubenswrapper[4937]: I0225 16:25:08.930227 4937 generic.go:334] "Generic (PLEG): container finished" podID="cc348e39-415e-461a-816e-478bae8b63a1" containerID="3dda86ba5dd62344fe12c251b0afbc5d00f221b7425d0a50b8587b05befe8ffe" exitCode=0 Feb 25 16:25:08 crc kubenswrapper[4937]: I0225 16:25:08.930416 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27xfk" event={"ID":"cc348e39-415e-461a-816e-478bae8b63a1","Type":"ContainerDied","Data":"3dda86ba5dd62344fe12c251b0afbc5d00f221b7425d0a50b8587b05befe8ffe"} Feb 25 16:25:08 crc kubenswrapper[4937]: I0225 16:25:08.930799 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27xfk" event={"ID":"cc348e39-415e-461a-816e-478bae8b63a1","Type":"ContainerStarted","Data":"c6a34ac4530e4ebecd2557221eb6bc0e2b9c799dbef217a072e9f86cfe2964a9"} Feb 25 16:25:10 crc kubenswrapper[4937]: I0225 16:25:10.949270 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27xfk" event={"ID":"cc348e39-415e-461a-816e-478bae8b63a1","Type":"ContainerStarted","Data":"cd92aed0d549c7759f42c6a10752d1137f04e9fbc08b98b33365506ff45e193a"} Feb 25 16:25:11 crc kubenswrapper[4937]: I0225 16:25:11.494630 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:25:11 crc kubenswrapper[4937]: I0225 16:25:11.494679 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:25:11 crc kubenswrapper[4937]: I0225 16:25:11.963167 4937 generic.go:334] "Generic (PLEG): container finished" podID="cc348e39-415e-461a-816e-478bae8b63a1" containerID="cd92aed0d549c7759f42c6a10752d1137f04e9fbc08b98b33365506ff45e193a" exitCode=0 Feb 25 16:25:11 crc kubenswrapper[4937]: I0225 16:25:11.963263 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27xfk" event={"ID":"cc348e39-415e-461a-816e-478bae8b63a1","Type":"ContainerDied","Data":"cd92aed0d549c7759f42c6a10752d1137f04e9fbc08b98b33365506ff45e193a"} Feb 25 16:25:12 crc kubenswrapper[4937]: I0225 16:25:12.979319 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27xfk" event={"ID":"cc348e39-415e-461a-816e-478bae8b63a1","Type":"ContainerStarted","Data":"828123248ac922f9910089df84be7e90e31bfb0fd7974b30b21a82dcae3f2cf7"} Feb 25 16:25:13 crc 
kubenswrapper[4937]: I0225 16:25:12.999692 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-27xfk" podStartSLOduration=2.36187782 podStartE2EDuration="5.999669375s" podCreationTimestamp="2026-02-25 16:25:07 +0000 UTC" firstStartedPulling="2026-02-25 16:25:08.932924274 +0000 UTC m=+2359.946316164" lastFinishedPulling="2026-02-25 16:25:12.570715829 +0000 UTC m=+2363.584107719" observedRunningTime="2026-02-25 16:25:12.996940567 +0000 UTC m=+2364.010332467" watchObservedRunningTime="2026-02-25 16:25:12.999669375 +0000 UTC m=+2364.013061265" Feb 25 16:25:17 crc kubenswrapper[4937]: I0225 16:25:17.387439 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:17 crc kubenswrapper[4937]: I0225 16:25:17.387787 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:17 crc kubenswrapper[4937]: I0225 16:25:17.443926 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:18 crc kubenswrapper[4937]: I0225 16:25:18.078359 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:18 crc kubenswrapper[4937]: I0225 16:25:18.144114 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27xfk"] Feb 25 16:25:20 crc kubenswrapper[4937]: I0225 16:25:20.051246 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-27xfk" podUID="cc348e39-415e-461a-816e-478bae8b63a1" containerName="registry-server" containerID="cri-o://828123248ac922f9910089df84be7e90e31bfb0fd7974b30b21a82dcae3f2cf7" gracePeriod=2 Feb 25 16:25:20 crc kubenswrapper[4937]: I0225 16:25:20.737667 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:20 crc kubenswrapper[4937]: I0225 16:25:20.889993 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc348e39-415e-461a-816e-478bae8b63a1-utilities\") pod \"cc348e39-415e-461a-816e-478bae8b63a1\" (UID: \"cc348e39-415e-461a-816e-478bae8b63a1\") " Feb 25 16:25:20 crc kubenswrapper[4937]: I0225 16:25:20.890146 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nh8c\" (UniqueName: \"kubernetes.io/projected/cc348e39-415e-461a-816e-478bae8b63a1-kube-api-access-2nh8c\") pod \"cc348e39-415e-461a-816e-478bae8b63a1\" (UID: \"cc348e39-415e-461a-816e-478bae8b63a1\") " Feb 25 16:25:20 crc kubenswrapper[4937]: I0225 16:25:20.890249 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc348e39-415e-461a-816e-478bae8b63a1-catalog-content\") pod \"cc348e39-415e-461a-816e-478bae8b63a1\" (UID: \"cc348e39-415e-461a-816e-478bae8b63a1\") " Feb 25 16:25:20 crc kubenswrapper[4937]: I0225 16:25:20.891200 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc348e39-415e-461a-816e-478bae8b63a1-utilities" (OuterVolumeSpecName: "utilities") pod "cc348e39-415e-461a-816e-478bae8b63a1" (UID: "cc348e39-415e-461a-816e-478bae8b63a1"). InnerVolumeSpecName "utilities". 
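
The pod_startup_latency_tracker entry above for redhat-marketplace-27xfk is internally consistent: the E2E duration is the watch-observed running time minus the pod creation timestamp (16:25:12.999669375 − 16:25:07 = 5.999669375s), and the SLO duration subtracts the image-pull window (lastFinishedPulling − firstStartedPulling = 3.637791555s), giving 2.36187782s. A small Go check of that arithmetic, using the timestamps exactly as printed in the entry:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps exactly as printed in the startup-latency entry above.
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2026-02-25 16:25:07 +0000 UTC")
	firstPull := parse("2026-02-25 16:25:08.932924274 +0000 UTC")
	lastPull := parse("2026-02-25 16:25:12.570715829 +0000 UTC")
	running := parse("2026-02-25 16:25:12.999669375 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration: 5.999669375s
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: 2.36187782s
	fmt.Println("podStartE2EDuration:", e2e.Seconds())
	fmt.Println("podStartSLOduration:", slo.Seconds())
}

The same relation (SLO duration = E2E duration minus image-pull time) also holds for the ssh-known-hosts-edpm-deployment-v5wvt and run-os-edpm-deployment-openstack-edpm-ipam-hxbrd startup entries further down.
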
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:25:20 crc kubenswrapper[4937]: I0225 16:25:20.901202 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc348e39-415e-461a-816e-478bae8b63a1-kube-api-access-2nh8c" (OuterVolumeSpecName: "kube-api-access-2nh8c") pod "cc348e39-415e-461a-816e-478bae8b63a1" (UID: "cc348e39-415e-461a-816e-478bae8b63a1"). InnerVolumeSpecName "kube-api-access-2nh8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:25:20 crc kubenswrapper[4937]: I0225 16:25:20.992826 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nh8c\" (UniqueName: \"kubernetes.io/projected/cc348e39-415e-461a-816e-478bae8b63a1-kube-api-access-2nh8c\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:20 crc kubenswrapper[4937]: I0225 16:25:20.992875 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc348e39-415e-461a-816e-478bae8b63a1-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.066363 4937 generic.go:334] "Generic (PLEG): container finished" podID="cc348e39-415e-461a-816e-478bae8b63a1" containerID="828123248ac922f9910089df84be7e90e31bfb0fd7974b30b21a82dcae3f2cf7" exitCode=0 Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.066420 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27xfk" event={"ID":"cc348e39-415e-461a-816e-478bae8b63a1","Type":"ContainerDied","Data":"828123248ac922f9910089df84be7e90e31bfb0fd7974b30b21a82dcae3f2cf7"} Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.066454 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27xfk" event={"ID":"cc348e39-415e-461a-816e-478bae8b63a1","Type":"ContainerDied","Data":"c6a34ac4530e4ebecd2557221eb6bc0e2b9c799dbef217a072e9f86cfe2964a9"} Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.066478 4937 scope.go:117] "RemoveContainer" containerID="828123248ac922f9910089df84be7e90e31bfb0fd7974b30b21a82dcae3f2cf7" Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.066620 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27xfk" Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.107815 4937 scope.go:117] "RemoveContainer" containerID="cd92aed0d549c7759f42c6a10752d1137f04e9fbc08b98b33365506ff45e193a" Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.118426 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc348e39-415e-461a-816e-478bae8b63a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc348e39-415e-461a-816e-478bae8b63a1" (UID: "cc348e39-415e-461a-816e-478bae8b63a1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.151810 4937 scope.go:117] "RemoveContainer" containerID="3dda86ba5dd62344fe12c251b0afbc5d00f221b7425d0a50b8587b05befe8ffe" Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.194037 4937 scope.go:117] "RemoveContainer" containerID="828123248ac922f9910089df84be7e90e31bfb0fd7974b30b21a82dcae3f2cf7" Feb 25 16:25:21 crc kubenswrapper[4937]: E0225 16:25:21.194464 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"828123248ac922f9910089df84be7e90e31bfb0fd7974b30b21a82dcae3f2cf7\": container with ID starting with 828123248ac922f9910089df84be7e90e31bfb0fd7974b30b21a82dcae3f2cf7 not found: ID does not exist" containerID="828123248ac922f9910089df84be7e90e31bfb0fd7974b30b21a82dcae3f2cf7" Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.194507 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"828123248ac922f9910089df84be7e90e31bfb0fd7974b30b21a82dcae3f2cf7"} err="failed to get container status \"828123248ac922f9910089df84be7e90e31bfb0fd7974b30b21a82dcae3f2cf7\": rpc error: code = NotFound desc = could not find container \"828123248ac922f9910089df84be7e90e31bfb0fd7974b30b21a82dcae3f2cf7\": container with ID starting with 828123248ac922f9910089df84be7e90e31bfb0fd7974b30b21a82dcae3f2cf7 not found: ID does not exist" Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.194528 4937 scope.go:117] "RemoveContainer" containerID="cd92aed0d549c7759f42c6a10752d1137f04e9fbc08b98b33365506ff45e193a" Feb 25 16:25:21 crc kubenswrapper[4937]: E0225 16:25:21.194894 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd92aed0d549c7759f42c6a10752d1137f04e9fbc08b98b33365506ff45e193a\": container with ID starting with cd92aed0d549c7759f42c6a10752d1137f04e9fbc08b98b33365506ff45e193a not found: ID does not exist" containerID="cd92aed0d549c7759f42c6a10752d1137f04e9fbc08b98b33365506ff45e193a" Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.194916 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd92aed0d549c7759f42c6a10752d1137f04e9fbc08b98b33365506ff45e193a"} err="failed to get container status \"cd92aed0d549c7759f42c6a10752d1137f04e9fbc08b98b33365506ff45e193a\": rpc error: code = NotFound desc = could not find container \"cd92aed0d549c7759f42c6a10752d1137f04e9fbc08b98b33365506ff45e193a\": container with ID starting with cd92aed0d549c7759f42c6a10752d1137f04e9fbc08b98b33365506ff45e193a not found: ID does not exist" Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.194929 4937 scope.go:117] "RemoveContainer" containerID="3dda86ba5dd62344fe12c251b0afbc5d00f221b7425d0a50b8587b05befe8ffe" Feb 25 16:25:21 crc kubenswrapper[4937]: E0225 16:25:21.195344 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dda86ba5dd62344fe12c251b0afbc5d00f221b7425d0a50b8587b05befe8ffe\": container with ID starting with 3dda86ba5dd62344fe12c251b0afbc5d00f221b7425d0a50b8587b05befe8ffe not found: ID does not exist" containerID="3dda86ba5dd62344fe12c251b0afbc5d00f221b7425d0a50b8587b05befe8ffe" Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.195391 4937 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3dda86ba5dd62344fe12c251b0afbc5d00f221b7425d0a50b8587b05befe8ffe"} err="failed to get container status \"3dda86ba5dd62344fe12c251b0afbc5d00f221b7425d0a50b8587b05befe8ffe\": rpc error: code = NotFound desc = could not find container \"3dda86ba5dd62344fe12c251b0afbc5d00f221b7425d0a50b8587b05befe8ffe\": container with ID starting with 3dda86ba5dd62344fe12c251b0afbc5d00f221b7425d0a50b8587b05befe8ffe not found: ID does not exist" Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.195975 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc348e39-415e-461a-816e-478bae8b63a1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.411133 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27xfk"] Feb 25 16:25:21 crc kubenswrapper[4937]: I0225 16:25:21.420795 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-27xfk"] Feb 25 16:25:23 crc kubenswrapper[4937]: I0225 16:25:23.379753 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc348e39-415e-461a-816e-478bae8b63a1" path="/var/lib/kubelet/pods/cc348e39-415e-461a-816e-478bae8b63a1/volumes" Feb 25 16:25:24 crc kubenswrapper[4937]: I0225 16:25:24.114458 4937 generic.go:334] "Generic (PLEG): container finished" podID="1bd696e6-be36-4b9e-9f00-9ba293305842" containerID="1430019dc9c1f13d949d2b533292410650d5249f7fbf72c7167a385561b8583c" exitCode=0 Feb 25 16:25:24 crc kubenswrapper[4937]: I0225 16:25:24.114679 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" event={"ID":"1bd696e6-be36-4b9e-9f00-9ba293305842","Type":"ContainerDied","Data":"1430019dc9c1f13d949d2b533292410650d5249f7fbf72c7167a385561b8583c"} Feb 25 16:25:25 crc kubenswrapper[4937]: I0225 16:25:25.735674 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" Feb 25 16:25:25 crc kubenswrapper[4937]: I0225 16:25:25.832410 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bd696e6-be36-4b9e-9f00-9ba293305842-ssh-key-openstack-edpm-ipam\") pod \"1bd696e6-be36-4b9e-9f00-9ba293305842\" (UID: \"1bd696e6-be36-4b9e-9f00-9ba293305842\") " Feb 25 16:25:25 crc kubenswrapper[4937]: I0225 16:25:25.832741 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bd696e6-be36-4b9e-9f00-9ba293305842-inventory\") pod \"1bd696e6-be36-4b9e-9f00-9ba293305842\" (UID: \"1bd696e6-be36-4b9e-9f00-9ba293305842\") " Feb 25 16:25:25 crc kubenswrapper[4937]: I0225 16:25:25.832796 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l75b\" (UniqueName: \"kubernetes.io/projected/1bd696e6-be36-4b9e-9f00-9ba293305842-kube-api-access-8l75b\") pod \"1bd696e6-be36-4b9e-9f00-9ba293305842\" (UID: \"1bd696e6-be36-4b9e-9f00-9ba293305842\") " Feb 25 16:25:25 crc kubenswrapper[4937]: I0225 16:25:25.837567 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bd696e6-be36-4b9e-9f00-9ba293305842-kube-api-access-8l75b" (OuterVolumeSpecName: "kube-api-access-8l75b") pod "1bd696e6-be36-4b9e-9f00-9ba293305842" (UID: "1bd696e6-be36-4b9e-9f00-9ba293305842"). InnerVolumeSpecName "kube-api-access-8l75b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:25:25 crc kubenswrapper[4937]: I0225 16:25:25.861690 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd696e6-be36-4b9e-9f00-9ba293305842-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1bd696e6-be36-4b9e-9f00-9ba293305842" (UID: "1bd696e6-be36-4b9e-9f00-9ba293305842"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:25:25 crc kubenswrapper[4937]: I0225 16:25:25.861887 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd696e6-be36-4b9e-9f00-9ba293305842-inventory" (OuterVolumeSpecName: "inventory") pod "1bd696e6-be36-4b9e-9f00-9ba293305842" (UID: "1bd696e6-be36-4b9e-9f00-9ba293305842"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:25:25 crc kubenswrapper[4937]: I0225 16:25:25.935370 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bd696e6-be36-4b9e-9f00-9ba293305842-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:25 crc kubenswrapper[4937]: I0225 16:25:25.935401 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bd696e6-be36-4b9e-9f00-9ba293305842-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:25 crc kubenswrapper[4937]: I0225 16:25:25.935414 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l75b\" (UniqueName: \"kubernetes.io/projected/1bd696e6-be36-4b9e-9f00-9ba293305842-kube-api-access-8l75b\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.137231 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" event={"ID":"1bd696e6-be36-4b9e-9f00-9ba293305842","Type":"ContainerDied","Data":"abc55d99185d868c12d71abfa748b66be13df23001384bbc9145bbee77e60027"} Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.137280 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abc55d99185d868c12d71abfa748b66be13df23001384bbc9145bbee77e60027" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.137301 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.240542 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v5wvt"] Feb 25 16:25:26 crc kubenswrapper[4937]: E0225 16:25:26.241009 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc348e39-415e-461a-816e-478bae8b63a1" containerName="extract-content" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.241053 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc348e39-415e-461a-816e-478bae8b63a1" containerName="extract-content" Feb 25 16:25:26 crc kubenswrapper[4937]: E0225 16:25:26.241085 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc348e39-415e-461a-816e-478bae8b63a1" containerName="registry-server" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.241094 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc348e39-415e-461a-816e-478bae8b63a1" containerName="registry-server" Feb 25 16:25:26 crc kubenswrapper[4937]: E0225 16:25:26.241122 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc348e39-415e-461a-816e-478bae8b63a1" containerName="extract-utilities" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.241131 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc348e39-415e-461a-816e-478bae8b63a1" containerName="extract-utilities" Feb 25 16:25:26 crc kubenswrapper[4937]: E0225 16:25:26.241147 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd696e6-be36-4b9e-9f00-9ba293305842" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.241155 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd696e6-be36-4b9e-9f00-9ba293305842" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.241398 4937 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd696e6-be36-4b9e-9f00-9ba293305842" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.241428 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc348e39-415e-461a-816e-478bae8b63a1" containerName="registry-server" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.242276 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.253429 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.253829 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.254028 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.254226 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.259002 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v5wvt"] Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.341137 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzlp4\" (UniqueName: \"kubernetes.io/projected/2793b4d3-40ec-416d-8a93-0bb9b23ab909-kube-api-access-bzlp4\") pod \"ssh-known-hosts-edpm-deployment-v5wvt\" (UID: \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.341264 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2793b4d3-40ec-416d-8a93-0bb9b23ab909-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v5wvt\" (UID: \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.341295 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2793b4d3-40ec-416d-8a93-0bb9b23ab909-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v5wvt\" (UID: \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.443080 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2793b4d3-40ec-416d-8a93-0bb9b23ab909-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v5wvt\" (UID: \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.443122 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2793b4d3-40ec-416d-8a93-0bb9b23ab909-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v5wvt\" (UID: \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.443296 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzlp4\" (UniqueName: \"kubernetes.io/projected/2793b4d3-40ec-416d-8a93-0bb9b23ab909-kube-api-access-bzlp4\") pod \"ssh-known-hosts-edpm-deployment-v5wvt\" (UID: \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.447365 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2793b4d3-40ec-416d-8a93-0bb9b23ab909-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v5wvt\" (UID: \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.450463 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2793b4d3-40ec-416d-8a93-0bb9b23ab909-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v5wvt\" (UID: \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.460029 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzlp4\" (UniqueName: \"kubernetes.io/projected/2793b4d3-40ec-416d-8a93-0bb9b23ab909-kube-api-access-bzlp4\") pod \"ssh-known-hosts-edpm-deployment-v5wvt\" (UID: \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" Feb 25 16:25:26 crc kubenswrapper[4937]: I0225 16:25:26.564082 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" Feb 25 16:25:27 crc kubenswrapper[4937]: I0225 16:25:27.136731 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v5wvt"] Feb 25 16:25:28 crc kubenswrapper[4937]: I0225 16:25:28.157946 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" event={"ID":"2793b4d3-40ec-416d-8a93-0bb9b23ab909","Type":"ContainerStarted","Data":"adda19525f5bd36f642335274a18fb599d825535f8c114d6ba067dc1767d07b2"} Feb 25 16:25:28 crc kubenswrapper[4937]: I0225 16:25:28.158569 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" event={"ID":"2793b4d3-40ec-416d-8a93-0bb9b23ab909","Type":"ContainerStarted","Data":"e0231756520f2c43a6c3206f47276545ccfc946dc5f1cd4df7adbe859897f2b0"} Feb 25 16:25:28 crc kubenswrapper[4937]: I0225 16:25:28.178831 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" podStartSLOduration=1.727758533 podStartE2EDuration="2.178804833s" podCreationTimestamp="2026-02-25 16:25:26 +0000 UTC" firstStartedPulling="2026-02-25 16:25:27.145798354 +0000 UTC m=+2378.159190244" lastFinishedPulling="2026-02-25 16:25:27.596844654 +0000 UTC m=+2378.610236544" observedRunningTime="2026-02-25 16:25:28.176900715 +0000 UTC m=+2379.190292615" watchObservedRunningTime="2026-02-25 16:25:28.178804833 +0000 UTC m=+2379.192196763" Feb 25 16:25:34 crc kubenswrapper[4937]: I0225 16:25:34.216080 4937 generic.go:334] "Generic (PLEG): container finished" podID="2793b4d3-40ec-416d-8a93-0bb9b23ab909" containerID="adda19525f5bd36f642335274a18fb599d825535f8c114d6ba067dc1767d07b2" exitCode=0 Feb 25 16:25:34 crc kubenswrapper[4937]: I0225 16:25:34.216199 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" event={"ID":"2793b4d3-40ec-416d-8a93-0bb9b23ab909","Type":"ContainerDied","Data":"adda19525f5bd36f642335274a18fb599d825535f8c114d6ba067dc1767d07b2"} Feb 25 16:25:35 crc kubenswrapper[4937]: I0225 16:25:35.047532 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-7pb8b"] Feb 25 16:25:35 crc kubenswrapper[4937]: I0225 16:25:35.057451 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7pb8b"] Feb 25 16:25:35 crc kubenswrapper[4937]: I0225 16:25:35.379200 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c91384-797d-4ca1-8a29-f800994d26b7" path="/var/lib/kubelet/pods/18c91384-797d-4ca1-8a29-f800994d26b7/volumes" Feb 25 16:25:35 crc kubenswrapper[4937]: I0225 16:25:35.822215 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" Feb 25 16:25:35 crc kubenswrapper[4937]: I0225 16:25:35.869681 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2793b4d3-40ec-416d-8a93-0bb9b23ab909-ssh-key-openstack-edpm-ipam\") pod \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\" (UID: \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\") " Feb 25 16:25:35 crc kubenswrapper[4937]: I0225 16:25:35.870793 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzlp4\" (UniqueName: \"kubernetes.io/projected/2793b4d3-40ec-416d-8a93-0bb9b23ab909-kube-api-access-bzlp4\") pod \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\" (UID: \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\") " Feb 25 16:25:35 crc kubenswrapper[4937]: I0225 16:25:35.871080 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2793b4d3-40ec-416d-8a93-0bb9b23ab909-inventory-0\") pod \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\" (UID: \"2793b4d3-40ec-416d-8a93-0bb9b23ab909\") " Feb 25 16:25:35 crc kubenswrapper[4937]: I0225 16:25:35.879019 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2793b4d3-40ec-416d-8a93-0bb9b23ab909-kube-api-access-bzlp4" (OuterVolumeSpecName: "kube-api-access-bzlp4") pod "2793b4d3-40ec-416d-8a93-0bb9b23ab909" (UID: "2793b4d3-40ec-416d-8a93-0bb9b23ab909"). InnerVolumeSpecName "kube-api-access-bzlp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:25:35 crc kubenswrapper[4937]: I0225 16:25:35.902356 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2793b4d3-40ec-416d-8a93-0bb9b23ab909-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "2793b4d3-40ec-416d-8a93-0bb9b23ab909" (UID: "2793b4d3-40ec-416d-8a93-0bb9b23ab909"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:25:35 crc kubenswrapper[4937]: I0225 16:25:35.908507 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2793b4d3-40ec-416d-8a93-0bb9b23ab909-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2793b4d3-40ec-416d-8a93-0bb9b23ab909" (UID: "2793b4d3-40ec-416d-8a93-0bb9b23ab909"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:25:35 crc kubenswrapper[4937]: I0225 16:25:35.973621 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzlp4\" (UniqueName: \"kubernetes.io/projected/2793b4d3-40ec-416d-8a93-0bb9b23ab909-kube-api-access-bzlp4\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:35 crc kubenswrapper[4937]: I0225 16:25:35.973653 4937 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2793b4d3-40ec-416d-8a93-0bb9b23ab909-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:35 crc kubenswrapper[4937]: I0225 16:25:35.973665 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2793b4d3-40ec-416d-8a93-0bb9b23ab909-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.242026 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" event={"ID":"2793b4d3-40ec-416d-8a93-0bb9b23ab909","Type":"ContainerDied","Data":"e0231756520f2c43a6c3206f47276545ccfc946dc5f1cd4df7adbe859897f2b0"} Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.242067 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0231756520f2c43a6c3206f47276545ccfc946dc5f1cd4df7adbe859897f2b0" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.242085 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v5wvt" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.322088 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd"] Feb 25 16:25:36 crc kubenswrapper[4937]: E0225 16:25:36.322661 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2793b4d3-40ec-416d-8a93-0bb9b23ab909" containerName="ssh-known-hosts-edpm-deployment" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.322686 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="2793b4d3-40ec-416d-8a93-0bb9b23ab909" containerName="ssh-known-hosts-edpm-deployment" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.322974 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="2793b4d3-40ec-416d-8a93-0bb9b23ab909" containerName="ssh-known-hosts-edpm-deployment" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.323984 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.326519 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.329428 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.329447 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.332588 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd"] Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.338190 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.383214 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e83917-e8a9-4ec3-9714-591147de094e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hxbrd\" (UID: \"f9e83917-e8a9-4ec3-9714-591147de094e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.383306 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnzn2\" (UniqueName: \"kubernetes.io/projected/f9e83917-e8a9-4ec3-9714-591147de094e-kube-api-access-hnzn2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hxbrd\" (UID: \"f9e83917-e8a9-4ec3-9714-591147de094e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.383406 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9e83917-e8a9-4ec3-9714-591147de094e-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hxbrd\" (UID: \"f9e83917-e8a9-4ec3-9714-591147de094e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.485634 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e83917-e8a9-4ec3-9714-591147de094e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hxbrd\" (UID: \"f9e83917-e8a9-4ec3-9714-591147de094e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.486133 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnzn2\" (UniqueName: \"kubernetes.io/projected/f9e83917-e8a9-4ec3-9714-591147de094e-kube-api-access-hnzn2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hxbrd\" (UID: \"f9e83917-e8a9-4ec3-9714-591147de094e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.486254 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9e83917-e8a9-4ec3-9714-591147de094e-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-hxbrd\" (UID: \"f9e83917-e8a9-4ec3-9714-591147de094e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.493107 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e83917-e8a9-4ec3-9714-591147de094e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hxbrd\" (UID: \"f9e83917-e8a9-4ec3-9714-591147de094e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.496027 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9e83917-e8a9-4ec3-9714-591147de094e-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hxbrd\" (UID: \"f9e83917-e8a9-4ec3-9714-591147de094e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.504845 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnzn2\" (UniqueName: \"kubernetes.io/projected/f9e83917-e8a9-4ec3-9714-591147de094e-kube-api-access-hnzn2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hxbrd\" (UID: \"f9e83917-e8a9-4ec3-9714-591147de094e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" Feb 25 16:25:36 crc kubenswrapper[4937]: I0225 16:25:36.658665 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" Feb 25 16:25:37 crc kubenswrapper[4937]: I0225 16:25:37.194755 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd"] Feb 25 16:25:37 crc kubenswrapper[4937]: I0225 16:25:37.257308 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" event={"ID":"f9e83917-e8a9-4ec3-9714-591147de094e","Type":"ContainerStarted","Data":"3ceb92e36baeaa456321d1089bbbb1f51a08e77d24570da1baf40f04de84b7e7"} Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.271748 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" event={"ID":"f9e83917-e8a9-4ec3-9714-591147de094e","Type":"ContainerStarted","Data":"4823c31a2ff9ef2bdf0045f3886e42d8c16865021a53e87507116251266daf26"} Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.301287 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" podStartSLOduration=1.790023114 podStartE2EDuration="2.301271472s" podCreationTimestamp="2026-02-25 16:25:36 +0000 UTC" firstStartedPulling="2026-02-25 16:25:37.199671924 +0000 UTC m=+2388.213063814" lastFinishedPulling="2026-02-25 16:25:37.710920282 +0000 UTC m=+2388.724312172" observedRunningTime="2026-02-25 16:25:38.295038736 +0000 UTC m=+2389.308430626" watchObservedRunningTime="2026-02-25 16:25:38.301271472 +0000 UTC m=+2389.314663362" Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.591621 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bs5qq"] Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.594564 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.607380 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bs5qq"] Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.639471 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjq28\" (UniqueName: \"kubernetes.io/projected/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-kube-api-access-tjq28\") pod \"community-operators-bs5qq\" (UID: \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\") " pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.639717 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-utilities\") pod \"community-operators-bs5qq\" (UID: \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\") " pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.639776 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-catalog-content\") pod \"community-operators-bs5qq\" (UID: \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\") " pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.741927 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-utilities\") pod \"community-operators-bs5qq\" (UID: \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\") " pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.741988 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-catalog-content\") pod \"community-operators-bs5qq\" (UID: \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\") " pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.742059 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjq28\" (UniqueName: \"kubernetes.io/projected/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-kube-api-access-tjq28\") pod \"community-operators-bs5qq\" (UID: \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\") " pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.742435 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-utilities\") pod \"community-operators-bs5qq\" (UID: \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\") " pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.742666 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-catalog-content\") pod \"community-operators-bs5qq\" (UID: \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\") " pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.765398 4937 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tjq28\" (UniqueName: \"kubernetes.io/projected/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-kube-api-access-tjq28\") pod \"community-operators-bs5qq\" (UID: \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\") " pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:38 crc kubenswrapper[4937]: I0225 16:25:38.949889 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:39 crc kubenswrapper[4937]: I0225 16:25:39.430786 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bs5qq"] Feb 25 16:25:39 crc kubenswrapper[4937]: W0225 16:25:39.432512 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a3c8ad6_e2ff_4247_8c61_0fc17017ef26.slice/crio-259f66aae0d8f5bf70092474e4fc361565d74fc3eee66347727e643f298b4057 WatchSource:0}: Error finding container 259f66aae0d8f5bf70092474e4fc361565d74fc3eee66347727e643f298b4057: Status 404 returned error can't find the container with id 259f66aae0d8f5bf70092474e4fc361565d74fc3eee66347727e643f298b4057 Feb 25 16:25:40 crc kubenswrapper[4937]: I0225 16:25:40.294776 4937 generic.go:334] "Generic (PLEG): container finished" podID="8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" containerID="793c288ebe5e35f59ea410258345c32cdefab6f752e00079e9db0c8c50e46b92" exitCode=0 Feb 25 16:25:40 crc kubenswrapper[4937]: I0225 16:25:40.294852 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bs5qq" event={"ID":"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26","Type":"ContainerDied","Data":"793c288ebe5e35f59ea410258345c32cdefab6f752e00079e9db0c8c50e46b92"} Feb 25 16:25:40 crc kubenswrapper[4937]: I0225 16:25:40.295188 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bs5qq" event={"ID":"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26","Type":"ContainerStarted","Data":"259f66aae0d8f5bf70092474e4fc361565d74fc3eee66347727e643f298b4057"} Feb 25 16:25:40 crc kubenswrapper[4937]: I0225 16:25:40.742911 4937 scope.go:117] "RemoveContainer" containerID="f833be4dfeb87ff3970dca032bfe28374fa43cab8bcdd748c1c3405ec0a1b478" Feb 25 16:25:40 crc kubenswrapper[4937]: I0225 16:25:40.822097 4937 scope.go:117] "RemoveContainer" containerID="818b0815324ce9f2eb9cf06dc48a11c257685002beb36cff09bf2a15122fa9df" Feb 25 16:25:40 crc kubenswrapper[4937]: I0225 16:25:40.866026 4937 scope.go:117] "RemoveContainer" containerID="08fb0f4b3814f3eb26b2a568a15dab936c66c22e0f142bc12dd7a55f5ce2f1d9" Feb 25 16:25:41 crc kubenswrapper[4937]: I0225 16:25:41.495237 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:25:41 crc kubenswrapper[4937]: I0225 16:25:41.495768 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:25:41 crc kubenswrapper[4937]: I0225 16:25:41.495840 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 16:25:41 crc kubenswrapper[4937]: I0225 16:25:41.497060 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 16:25:41 crc kubenswrapper[4937]: I0225 16:25:41.497185 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" gracePeriod=600 Feb 25 16:25:41 crc kubenswrapper[4937]: E0225 16:25:41.677204 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:25:42 crc kubenswrapper[4937]: I0225 16:25:42.323045 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" exitCode=0 Feb 25 16:25:42 crc kubenswrapper[4937]: I0225 16:25:42.323144 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3"} Feb 25 16:25:42 crc kubenswrapper[4937]: I0225 16:25:42.323219 4937 scope.go:117] "RemoveContainer" containerID="a1902f50a856ad1bac31a5abd173e354d882341ff395f33b062b6ec2ed08e38e" Feb 25 16:25:42 crc kubenswrapper[4937]: I0225 16:25:42.324384 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:25:42 crc kubenswrapper[4937]: E0225 16:25:42.325060 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:25:42 crc kubenswrapper[4937]: I0225 16:25:42.327942 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bs5qq" event={"ID":"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26","Type":"ContainerStarted","Data":"b43b4502938546cf758d6e64561a0e464a8fd264856bc261970e2966c4b10e31"} Feb 25 16:25:43 crc kubenswrapper[4937]: I0225 16:25:43.342817 4937 generic.go:334] "Generic (PLEG): container finished" podID="8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" containerID="b43b4502938546cf758d6e64561a0e464a8fd264856bc261970e2966c4b10e31" exitCode=0 Feb 25 16:25:43 crc kubenswrapper[4937]: I0225 16:25:43.342918 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-bs5qq" event={"ID":"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26","Type":"ContainerDied","Data":"b43b4502938546cf758d6e64561a0e464a8fd264856bc261970e2966c4b10e31"} Feb 25 16:25:44 crc kubenswrapper[4937]: I0225 16:25:44.355693 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bs5qq" event={"ID":"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26","Type":"ContainerStarted","Data":"14973d28fc430d167c40649e3f19cf342a8152351b30bc2cbd789d7622f5f332"} Feb 25 16:25:44 crc kubenswrapper[4937]: I0225 16:25:44.385480 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bs5qq" podStartSLOduration=2.972015388 podStartE2EDuration="6.385448762s" podCreationTimestamp="2026-02-25 16:25:38 +0000 UTC" firstStartedPulling="2026-02-25 16:25:40.298000713 +0000 UTC m=+2391.311392633" lastFinishedPulling="2026-02-25 16:25:43.711434117 +0000 UTC m=+2394.724826007" observedRunningTime="2026-02-25 16:25:44.379203016 +0000 UTC m=+2395.392594926" watchObservedRunningTime="2026-02-25 16:25:44.385448762 +0000 UTC m=+2395.398840662" Feb 25 16:25:46 crc kubenswrapper[4937]: I0225 16:25:46.382154 4937 generic.go:334] "Generic (PLEG): container finished" podID="f9e83917-e8a9-4ec3-9714-591147de094e" containerID="4823c31a2ff9ef2bdf0045f3886e42d8c16865021a53e87507116251266daf26" exitCode=0 Feb 25 16:25:46 crc kubenswrapper[4937]: I0225 16:25:46.382206 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" event={"ID":"f9e83917-e8a9-4ec3-9714-591147de094e","Type":"ContainerDied","Data":"4823c31a2ff9ef2bdf0045f3886e42d8c16865021a53e87507116251266daf26"} Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.061221 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.190953 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e83917-e8a9-4ec3-9714-591147de094e-inventory\") pod \"f9e83917-e8a9-4ec3-9714-591147de094e\" (UID: \"f9e83917-e8a9-4ec3-9714-591147de094e\") " Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.191179 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9e83917-e8a9-4ec3-9714-591147de094e-ssh-key-openstack-edpm-ipam\") pod \"f9e83917-e8a9-4ec3-9714-591147de094e\" (UID: \"f9e83917-e8a9-4ec3-9714-591147de094e\") " Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.191201 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnzn2\" (UniqueName: \"kubernetes.io/projected/f9e83917-e8a9-4ec3-9714-591147de094e-kube-api-access-hnzn2\") pod \"f9e83917-e8a9-4ec3-9714-591147de094e\" (UID: \"f9e83917-e8a9-4ec3-9714-591147de094e\") " Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.196091 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e83917-e8a9-4ec3-9714-591147de094e-kube-api-access-hnzn2" (OuterVolumeSpecName: "kube-api-access-hnzn2") pod "f9e83917-e8a9-4ec3-9714-591147de094e" (UID: "f9e83917-e8a9-4ec3-9714-591147de094e"). InnerVolumeSpecName "kube-api-access-hnzn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.219644 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e83917-e8a9-4ec3-9714-591147de094e-inventory" (OuterVolumeSpecName: "inventory") pod "f9e83917-e8a9-4ec3-9714-591147de094e" (UID: "f9e83917-e8a9-4ec3-9714-591147de094e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.236684 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e83917-e8a9-4ec3-9714-591147de094e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f9e83917-e8a9-4ec3-9714-591147de094e" (UID: "f9e83917-e8a9-4ec3-9714-591147de094e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.294564 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e83917-e8a9-4ec3-9714-591147de094e-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.294608 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9e83917-e8a9-4ec3-9714-591147de094e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.294624 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnzn2\" (UniqueName: \"kubernetes.io/projected/f9e83917-e8a9-4ec3-9714-591147de094e-kube-api-access-hnzn2\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.405542 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" event={"ID":"f9e83917-e8a9-4ec3-9714-591147de094e","Type":"ContainerDied","Data":"3ceb92e36baeaa456321d1089bbbb1f51a08e77d24570da1baf40f04de84b7e7"} Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.405584 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ceb92e36baeaa456321d1089bbbb1f51a08e77d24570da1baf40f04de84b7e7" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.405654 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hxbrd" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.482846 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv"] Feb 25 16:25:48 crc kubenswrapper[4937]: E0225 16:25:48.483653 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e83917-e8a9-4ec3-9714-591147de094e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.483667 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e83917-e8a9-4ec3-9714-591147de094e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.483879 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e83917-e8a9-4ec3-9714-591147de094e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.484610 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.487869 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.488030 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.488206 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.488318 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.514079 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv"] Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.603332 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv\" (UID: \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.603580 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv\" (UID: \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.603762 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmgd\" (UniqueName: \"kubernetes.io/projected/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-kube-api-access-7gmgd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv\" (UID: \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.706212 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv\" (UID: \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.706316 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmgd\" (UniqueName: \"kubernetes.io/projected/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-kube-api-access-7gmgd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv\" (UID: \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.706356 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv\" (UID: \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.710885 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv\" (UID: \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.718585 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv\" (UID: \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.729774 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmgd\" (UniqueName: \"kubernetes.io/projected/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-kube-api-access-7gmgd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv\" (UID: \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.800592 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.950105 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:48 crc kubenswrapper[4937]: I0225 16:25:48.950418 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:49 crc kubenswrapper[4937]: I0225 16:25:49.015334 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:49 crc kubenswrapper[4937]: I0225 16:25:49.336264 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv"] Feb 25 16:25:49 crc kubenswrapper[4937]: I0225 16:25:49.416242 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" event={"ID":"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9","Type":"ContainerStarted","Data":"f48a8af234f4af8b145de05d913695a73d44bcc6db82025e85cdb65d297dc8d1"} Feb 25 16:25:49 crc kubenswrapper[4937]: I0225 16:25:49.467988 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:49 crc kubenswrapper[4937]: I0225 16:25:49.517961 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bs5qq"] Feb 25 16:25:50 crc kubenswrapper[4937]: I0225 16:25:50.429838 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" event={"ID":"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9","Type":"ContainerStarted","Data":"a520bbf6ec44f52dac217f61e6ddc7f6cc9cc13539b2b55b3f1a3ea7d307df58"} Feb 25 16:25:50 crc kubenswrapper[4937]: I0225 16:25:50.459403 4937 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" podStartSLOduration=1.98367604 podStartE2EDuration="2.459374107s" podCreationTimestamp="2026-02-25 16:25:48 +0000 UTC" firstStartedPulling="2026-02-25 16:25:49.351194205 +0000 UTC m=+2400.364586085" lastFinishedPulling="2026-02-25 16:25:49.826892222 +0000 UTC m=+2400.840284152" observedRunningTime="2026-02-25 16:25:50.451652544 +0000 UTC m=+2401.465044464" watchObservedRunningTime="2026-02-25 16:25:50.459374107 +0000 UTC m=+2401.472766017" Feb 25 16:25:51 crc kubenswrapper[4937]: I0225 16:25:51.439194 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bs5qq" podUID="8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" containerName="registry-server" containerID="cri-o://14973d28fc430d167c40649e3f19cf342a8152351b30bc2cbd789d7622f5f332" gracePeriod=2 Feb 25 16:25:51 crc kubenswrapper[4937]: I0225 16:25:51.994906 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.194314 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjq28\" (UniqueName: \"kubernetes.io/projected/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-kube-api-access-tjq28\") pod \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\" (UID: \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\") " Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.194516 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-catalog-content\") pod \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\" (UID: \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\") " Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.194644 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-utilities\") pod \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\" (UID: \"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26\") " Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.195374 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-utilities" (OuterVolumeSpecName: "utilities") pod "8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" (UID: "8a3c8ad6-e2ff-4247-8c61-0fc17017ef26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.196332 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.199741 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-kube-api-access-tjq28" (OuterVolumeSpecName: "kube-api-access-tjq28") pod "8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" (UID: "8a3c8ad6-e2ff-4247-8c61-0fc17017ef26"). InnerVolumeSpecName "kube-api-access-tjq28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.298600 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjq28\" (UniqueName: \"kubernetes.io/projected/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-kube-api-access-tjq28\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.459615 4937 generic.go:334] "Generic (PLEG): container finished" podID="8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" containerID="14973d28fc430d167c40649e3f19cf342a8152351b30bc2cbd789d7622f5f332" exitCode=0 Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.459658 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bs5qq" event={"ID":"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26","Type":"ContainerDied","Data":"14973d28fc430d167c40649e3f19cf342a8152351b30bc2cbd789d7622f5f332"} Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.459691 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bs5qq" event={"ID":"8a3c8ad6-e2ff-4247-8c61-0fc17017ef26","Type":"ContainerDied","Data":"259f66aae0d8f5bf70092474e4fc361565d74fc3eee66347727e643f298b4057"} Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.459712 4937 scope.go:117] "RemoveContainer" containerID="14973d28fc430d167c40649e3f19cf342a8152351b30bc2cbd789d7622f5f332" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.459800 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bs5qq" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.487648 4937 scope.go:117] "RemoveContainer" containerID="b43b4502938546cf758d6e64561a0e464a8fd264856bc261970e2966c4b10e31" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.519609 4937 scope.go:117] "RemoveContainer" containerID="793c288ebe5e35f59ea410258345c32cdefab6f752e00079e9db0c8c50e46b92" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.563129 4937 scope.go:117] "RemoveContainer" containerID="14973d28fc430d167c40649e3f19cf342a8152351b30bc2cbd789d7622f5f332" Feb 25 16:25:52 crc kubenswrapper[4937]: E0225 16:25:52.563644 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14973d28fc430d167c40649e3f19cf342a8152351b30bc2cbd789d7622f5f332\": container with ID starting with 14973d28fc430d167c40649e3f19cf342a8152351b30bc2cbd789d7622f5f332 not found: ID does not exist" containerID="14973d28fc430d167c40649e3f19cf342a8152351b30bc2cbd789d7622f5f332" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.563684 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14973d28fc430d167c40649e3f19cf342a8152351b30bc2cbd789d7622f5f332"} err="failed to get container status \"14973d28fc430d167c40649e3f19cf342a8152351b30bc2cbd789d7622f5f332\": rpc error: code = NotFound desc = could not find container \"14973d28fc430d167c40649e3f19cf342a8152351b30bc2cbd789d7622f5f332\": container with ID starting with 14973d28fc430d167c40649e3f19cf342a8152351b30bc2cbd789d7622f5f332 not found: ID does not exist" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.563704 4937 scope.go:117] "RemoveContainer" containerID="b43b4502938546cf758d6e64561a0e464a8fd264856bc261970e2966c4b10e31" Feb 25 16:25:52 crc kubenswrapper[4937]: E0225 16:25:52.564053 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"b43b4502938546cf758d6e64561a0e464a8fd264856bc261970e2966c4b10e31\": container with ID starting with b43b4502938546cf758d6e64561a0e464a8fd264856bc261970e2966c4b10e31 not found: ID does not exist" containerID="b43b4502938546cf758d6e64561a0e464a8fd264856bc261970e2966c4b10e31" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.564087 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43b4502938546cf758d6e64561a0e464a8fd264856bc261970e2966c4b10e31"} err="failed to get container status \"b43b4502938546cf758d6e64561a0e464a8fd264856bc261970e2966c4b10e31\": rpc error: code = NotFound desc = could not find container \"b43b4502938546cf758d6e64561a0e464a8fd264856bc261970e2966c4b10e31\": container with ID starting with b43b4502938546cf758d6e64561a0e464a8fd264856bc261970e2966c4b10e31 not found: ID does not exist" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.564115 4937 scope.go:117] "RemoveContainer" containerID="793c288ebe5e35f59ea410258345c32cdefab6f752e00079e9db0c8c50e46b92" Feb 25 16:25:52 crc kubenswrapper[4937]: E0225 16:25:52.564384 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"793c288ebe5e35f59ea410258345c32cdefab6f752e00079e9db0c8c50e46b92\": container with ID starting with 793c288ebe5e35f59ea410258345c32cdefab6f752e00079e9db0c8c50e46b92 not found: ID does not exist" containerID="793c288ebe5e35f59ea410258345c32cdefab6f752e00079e9db0c8c50e46b92" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.564412 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"793c288ebe5e35f59ea410258345c32cdefab6f752e00079e9db0c8c50e46b92"} err="failed to get container status \"793c288ebe5e35f59ea410258345c32cdefab6f752e00079e9db0c8c50e46b92\": rpc error: code = NotFound desc = could not find container \"793c288ebe5e35f59ea410258345c32cdefab6f752e00079e9db0c8c50e46b92\": container with ID starting with 793c288ebe5e35f59ea410258345c32cdefab6f752e00079e9db0c8c50e46b92 not found: ID does not exist" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.805130 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" (UID: "8a3c8ad6-e2ff-4247-8c61-0fc17017ef26"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:25:52 crc kubenswrapper[4937]: I0225 16:25:52.807277 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:25:53 crc kubenswrapper[4937]: I0225 16:25:53.103315 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bs5qq"] Feb 25 16:25:53 crc kubenswrapper[4937]: I0225 16:25:53.120540 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bs5qq"] Feb 25 16:25:53 crc kubenswrapper[4937]: I0225 16:25:53.387132 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" path="/var/lib/kubelet/pods/8a3c8ad6-e2ff-4247-8c61-0fc17017ef26/volumes" Feb 25 16:25:56 crc kubenswrapper[4937]: I0225 16:25:56.367707 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:25:56 crc kubenswrapper[4937]: E0225 16:25:56.368345 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:25:59 crc kubenswrapper[4937]: I0225 16:25:59.535427 4937 generic.go:334] "Generic (PLEG): container finished" podID="7edeb14d-a4c4-402a-a45f-b30a6f23ffe9" containerID="a520bbf6ec44f52dac217f61e6ddc7f6cc9cc13539b2b55b3f1a3ea7d307df58" exitCode=0 Feb 25 16:25:59 crc kubenswrapper[4937]: I0225 16:25:59.535514 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" event={"ID":"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9","Type":"ContainerDied","Data":"a520bbf6ec44f52dac217f61e6ddc7f6cc9cc13539b2b55b3f1a3ea7d307df58"} Feb 25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.176617 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533946-gs99v"] Feb 25 16:26:00 crc kubenswrapper[4937]: E0225 16:26:00.177960 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" containerName="extract-utilities" Feb 25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.178007 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" containerName="extract-utilities" Feb 25 16:26:00 crc kubenswrapper[4937]: E0225 16:26:00.178081 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" containerName="registry-server" Feb 25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.178100 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" containerName="registry-server" Feb 25 16:26:00 crc kubenswrapper[4937]: E0225 16:26:00.178165 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" containerName="extract-content" Feb 25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.178185 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" containerName="extract-content" Feb 
25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.179129 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3c8ad6-e2ff-4247-8c61-0fc17017ef26" containerName="registry-server" Feb 25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.180902 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533946-gs99v" Feb 25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.185018 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.185339 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.185609 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.191676 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533946-gs99v"] Feb 25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.376502 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98v96\" (UniqueName: \"kubernetes.io/projected/17608ad1-2ff4-446f-af20-a509268ac519-kube-api-access-98v96\") pod \"auto-csr-approver-29533946-gs99v\" (UID: \"17608ad1-2ff4-446f-af20-a509268ac519\") " pod="openshift-infra/auto-csr-approver-29533946-gs99v" Feb 25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.481135 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98v96\" (UniqueName: \"kubernetes.io/projected/17608ad1-2ff4-446f-af20-a509268ac519-kube-api-access-98v96\") pod \"auto-csr-approver-29533946-gs99v\" (UID: \"17608ad1-2ff4-446f-af20-a509268ac519\") " pod="openshift-infra/auto-csr-approver-29533946-gs99v" Feb 25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.515848 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98v96\" (UniqueName: \"kubernetes.io/projected/17608ad1-2ff4-446f-af20-a509268ac519-kube-api-access-98v96\") pod \"auto-csr-approver-29533946-gs99v\" (UID: \"17608ad1-2ff4-446f-af20-a509268ac519\") " pod="openshift-infra/auto-csr-approver-29533946-gs99v" Feb 25 16:26:00 crc kubenswrapper[4937]: I0225 16:26:00.808633 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533946-gs99v" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.043056 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.194645 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-ssh-key-openstack-edpm-ipam\") pod \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\" (UID: \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\") " Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.194696 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gmgd\" (UniqueName: \"kubernetes.io/projected/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-kube-api-access-7gmgd\") pod \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\" (UID: \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\") " Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.194948 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-inventory\") pod \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\" (UID: \"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9\") " Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.203574 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-kube-api-access-7gmgd" (OuterVolumeSpecName: "kube-api-access-7gmgd") pod "7edeb14d-a4c4-402a-a45f-b30a6f23ffe9" (UID: "7edeb14d-a4c4-402a-a45f-b30a6f23ffe9"). InnerVolumeSpecName "kube-api-access-7gmgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.241005 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-inventory" (OuterVolumeSpecName: "inventory") pod "7edeb14d-a4c4-402a-a45f-b30a6f23ffe9" (UID: "7edeb14d-a4c4-402a-a45f-b30a6f23ffe9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.246700 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7edeb14d-a4c4-402a-a45f-b30a6f23ffe9" (UID: "7edeb14d-a4c4-402a-a45f-b30a6f23ffe9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.289762 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533946-gs99v"] Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.297807 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.297877 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.297906 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gmgd\" (UniqueName: \"kubernetes.io/projected/7edeb14d-a4c4-402a-a45f-b30a6f23ffe9-kube-api-access-7gmgd\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.562126 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" event={"ID":"7edeb14d-a4c4-402a-a45f-b30a6f23ffe9","Type":"ContainerDied","Data":"f48a8af234f4af8b145de05d913695a73d44bcc6db82025e85cdb65d297dc8d1"} Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.562186 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f48a8af234f4af8b145de05d913695a73d44bcc6db82025e85cdb65d297dc8d1" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.562260 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.574061 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533946-gs99v" event={"ID":"17608ad1-2ff4-446f-af20-a509268ac519","Type":"ContainerStarted","Data":"066f5e4a1a3f4947db58a7cb2629ddbf441142e7f3f202e08874a27f1071a676"} Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.660916 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs"] Feb 25 16:26:01 crc kubenswrapper[4937]: E0225 16:26:01.661337 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7edeb14d-a4c4-402a-a45f-b30a6f23ffe9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.661354 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="7edeb14d-a4c4-402a-a45f-b30a6f23ffe9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.661567 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="7edeb14d-a4c4-402a-a45f-b30a6f23ffe9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.662228 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.664360 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.664377 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.664359 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.664588 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.665438 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.667090 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.667473 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.667637 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.684179 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs"] Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853199 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853552 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853581 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853652 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853679 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853710 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72m5l\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-kube-api-access-72m5l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853748 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853780 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853810 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853853 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853872 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") 
" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853900 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853929 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.853948 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.957124 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.957232 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.957299 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72m5l\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-kube-api-access-72m5l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.957364 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.957434 4937 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.957535 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.957660 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.957729 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.957846 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.957926 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.957962 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.958017 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.958084 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.958128 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.962751 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.962773 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.963174 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.963455 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.963982 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.964301 4937 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.965177 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.965215 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.966516 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.968016 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.974797 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.977232 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.978017 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:01 crc kubenswrapper[4937]: I0225 16:26:01.990158 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72m5l\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-kube-api-access-72m5l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r74cs\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.277584 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.484413 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z7xxx"] Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.488267 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.493903 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z7xxx"] Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.674978 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brhc4\" (UniqueName: \"kubernetes.io/projected/671a0a21-c761-422a-beff-41dc8f866615-kube-api-access-brhc4\") pod \"certified-operators-z7xxx\" (UID: \"671a0a21-c761-422a-beff-41dc8f866615\") " pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.675074 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671a0a21-c761-422a-beff-41dc8f866615-catalog-content\") pod \"certified-operators-z7xxx\" (UID: \"671a0a21-c761-422a-beff-41dc8f866615\") " pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.675152 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671a0a21-c761-422a-beff-41dc8f866615-utilities\") pod \"certified-operators-z7xxx\" (UID: \"671a0a21-c761-422a-beff-41dc8f866615\") " pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.777127 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671a0a21-c761-422a-beff-41dc8f866615-utilities\") pod \"certified-operators-z7xxx\" (UID: \"671a0a21-c761-422a-beff-41dc8f866615\") " pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.777397 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brhc4\" (UniqueName: \"kubernetes.io/projected/671a0a21-c761-422a-beff-41dc8f866615-kube-api-access-brhc4\") pod \"certified-operators-z7xxx\" (UID: \"671a0a21-c761-422a-beff-41dc8f866615\") " pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.777470 4937 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671a0a21-c761-422a-beff-41dc8f866615-catalog-content\") pod \"certified-operators-z7xxx\" (UID: \"671a0a21-c761-422a-beff-41dc8f866615\") " pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.777543 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671a0a21-c761-422a-beff-41dc8f866615-utilities\") pod \"certified-operators-z7xxx\" (UID: \"671a0a21-c761-422a-beff-41dc8f866615\") " pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.777990 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671a0a21-c761-422a-beff-41dc8f866615-catalog-content\") pod \"certified-operators-z7xxx\" (UID: \"671a0a21-c761-422a-beff-41dc8f866615\") " pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.799946 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brhc4\" (UniqueName: \"kubernetes.io/projected/671a0a21-c761-422a-beff-41dc8f866615-kube-api-access-brhc4\") pod \"certified-operators-z7xxx\" (UID: \"671a0a21-c761-422a-beff-41dc8f866615\") " pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.817002 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:02 crc kubenswrapper[4937]: I0225 16:26:02.936205 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs"] Feb 25 16:26:02 crc kubenswrapper[4937]: W0225 16:26:02.955990 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd08f7150_84a3_42bf_bed8_624a7f5e2c35.slice/crio-2385f1e4300047131519ff6e535428b3319c27fb120fe55863131976e7986e2a WatchSource:0}: Error finding container 2385f1e4300047131519ff6e535428b3319c27fb120fe55863131976e7986e2a: Status 404 returned error can't find the container with id 2385f1e4300047131519ff6e535428b3319c27fb120fe55863131976e7986e2a Feb 25 16:26:03 crc kubenswrapper[4937]: I0225 16:26:03.434132 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z7xxx"] Feb 25 16:26:03 crc kubenswrapper[4937]: I0225 16:26:03.605593 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533946-gs99v" event={"ID":"17608ad1-2ff4-446f-af20-a509268ac519","Type":"ContainerStarted","Data":"b376e5ac0917e5b867cd3dddeb69853e1432bac988429aafd30e073dadee3fd5"} Feb 25 16:26:03 crc kubenswrapper[4937]: I0225 16:26:03.608630 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" event={"ID":"d08f7150-84a3-42bf-bed8-624a7f5e2c35","Type":"ContainerStarted","Data":"2385f1e4300047131519ff6e535428b3319c27fb120fe55863131976e7986e2a"} Feb 25 16:26:03 crc kubenswrapper[4937]: I0225 16:26:03.612831 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xxx" event={"ID":"671a0a21-c761-422a-beff-41dc8f866615","Type":"ContainerStarted","Data":"41f6b5ccd628a6dc013eeb597df63cbc7d7d1770d5346bd3f091f51a9cbe1c74"} Feb 25 16:26:03 crc 
kubenswrapper[4937]: I0225 16:26:03.612893 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xxx" event={"ID":"671a0a21-c761-422a-beff-41dc8f866615","Type":"ContainerStarted","Data":"f0a2ce3c1c592452003c532cf072c06e6430a82cfc789349e20536f3d77b62f0"} Feb 25 16:26:03 crc kubenswrapper[4937]: I0225 16:26:03.630458 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533946-gs99v" podStartSLOduration=1.89243552 podStartE2EDuration="3.630432991s" podCreationTimestamp="2026-02-25 16:26:00 +0000 UTC" firstStartedPulling="2026-02-25 16:26:01.292697105 +0000 UTC m=+2412.306088995" lastFinishedPulling="2026-02-25 16:26:03.030694576 +0000 UTC m=+2414.044086466" observedRunningTime="2026-02-25 16:26:03.623993509 +0000 UTC m=+2414.637385399" watchObservedRunningTime="2026-02-25 16:26:03.630432991 +0000 UTC m=+2414.643824891" Feb 25 16:26:03 crc kubenswrapper[4937]: I0225 16:26:03.671362 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" podStartSLOduration=2.238133012 podStartE2EDuration="2.671344105s" podCreationTimestamp="2026-02-25 16:26:01 +0000 UTC" firstStartedPulling="2026-02-25 16:26:02.962759524 +0000 UTC m=+2413.976151414" lastFinishedPulling="2026-02-25 16:26:03.395970607 +0000 UTC m=+2414.409362507" observedRunningTime="2026-02-25 16:26:03.670004762 +0000 UTC m=+2414.683396652" watchObservedRunningTime="2026-02-25 16:26:03.671344105 +0000 UTC m=+2414.684735995" Feb 25 16:26:04 crc kubenswrapper[4937]: I0225 16:26:04.626118 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" event={"ID":"d08f7150-84a3-42bf-bed8-624a7f5e2c35","Type":"ContainerStarted","Data":"42a8f122d451424a56206f9206aa4bce0f0841a66fbd058b4a77dee9ce4f4f1f"} Feb 25 16:26:04 crc kubenswrapper[4937]: I0225 16:26:04.631427 4937 generic.go:334] "Generic (PLEG): container finished" podID="671a0a21-c761-422a-beff-41dc8f866615" containerID="41f6b5ccd628a6dc013eeb597df63cbc7d7d1770d5346bd3f091f51a9cbe1c74" exitCode=0 Feb 25 16:26:04 crc kubenswrapper[4937]: I0225 16:26:04.632005 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xxx" event={"ID":"671a0a21-c761-422a-beff-41dc8f866615","Type":"ContainerDied","Data":"41f6b5ccd628a6dc013eeb597df63cbc7d7d1770d5346bd3f091f51a9cbe1c74"} Feb 25 16:26:04 crc kubenswrapper[4937]: I0225 16:26:04.635125 4937 generic.go:334] "Generic (PLEG): container finished" podID="17608ad1-2ff4-446f-af20-a509268ac519" containerID="b376e5ac0917e5b867cd3dddeb69853e1432bac988429aafd30e073dadee3fd5" exitCode=0 Feb 25 16:26:04 crc kubenswrapper[4937]: I0225 16:26:04.635181 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533946-gs99v" event={"ID":"17608ad1-2ff4-446f-af20-a509268ac519","Type":"ContainerDied","Data":"b376e5ac0917e5b867cd3dddeb69853e1432bac988429aafd30e073dadee3fd5"} Feb 25 16:26:05 crc kubenswrapper[4937]: I0225 16:26:05.644566 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xxx" event={"ID":"671a0a21-c761-422a-beff-41dc8f866615","Type":"ContainerStarted","Data":"7f2ce8d341d7acc932e8ebd40f6a6aee2e6840cd15afec319c374bb7a1e52f9f"} Feb 25 16:26:06 crc kubenswrapper[4937]: I0225 16:26:06.080573 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533946-gs99v" Feb 25 16:26:06 crc kubenswrapper[4937]: I0225 16:26:06.169274 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98v96\" (UniqueName: \"kubernetes.io/projected/17608ad1-2ff4-446f-af20-a509268ac519-kube-api-access-98v96\") pod \"17608ad1-2ff4-446f-af20-a509268ac519\" (UID: \"17608ad1-2ff4-446f-af20-a509268ac519\") " Feb 25 16:26:06 crc kubenswrapper[4937]: I0225 16:26:06.192140 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17608ad1-2ff4-446f-af20-a509268ac519-kube-api-access-98v96" (OuterVolumeSpecName: "kube-api-access-98v96") pod "17608ad1-2ff4-446f-af20-a509268ac519" (UID: "17608ad1-2ff4-446f-af20-a509268ac519"). InnerVolumeSpecName "kube-api-access-98v96". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:26:06 crc kubenswrapper[4937]: I0225 16:26:06.273717 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98v96\" (UniqueName: \"kubernetes.io/projected/17608ad1-2ff4-446f-af20-a509268ac519-kube-api-access-98v96\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:06 crc kubenswrapper[4937]: I0225 16:26:06.665713 4937 generic.go:334] "Generic (PLEG): container finished" podID="671a0a21-c761-422a-beff-41dc8f866615" containerID="7f2ce8d341d7acc932e8ebd40f6a6aee2e6840cd15afec319c374bb7a1e52f9f" exitCode=0 Feb 25 16:26:06 crc kubenswrapper[4937]: I0225 16:26:06.665829 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xxx" event={"ID":"671a0a21-c761-422a-beff-41dc8f866615","Type":"ContainerDied","Data":"7f2ce8d341d7acc932e8ebd40f6a6aee2e6840cd15afec319c374bb7a1e52f9f"} Feb 25 16:26:06 crc kubenswrapper[4937]: I0225 16:26:06.671048 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533946-gs99v" event={"ID":"17608ad1-2ff4-446f-af20-a509268ac519","Type":"ContainerDied","Data":"066f5e4a1a3f4947db58a7cb2629ddbf441142e7f3f202e08874a27f1071a676"} Feb 25 16:26:06 crc kubenswrapper[4937]: I0225 16:26:06.671087 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="066f5e4a1a3f4947db58a7cb2629ddbf441142e7f3f202e08874a27f1071a676" Feb 25 16:26:06 crc kubenswrapper[4937]: I0225 16:26:06.671134 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533946-gs99v" Feb 25 16:26:06 crc kubenswrapper[4937]: I0225 16:26:06.720150 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533940-84f7g"] Feb 25 16:26:06 crc kubenswrapper[4937]: I0225 16:26:06.731824 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533940-84f7g"] Feb 25 16:26:07 crc kubenswrapper[4937]: I0225 16:26:07.381891 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c4e56b-a36f-48a7-b4d7-b962a074d740" path="/var/lib/kubelet/pods/e7c4e56b-a36f-48a7-b4d7-b962a074d740/volumes" Feb 25 16:26:07 crc kubenswrapper[4937]: I0225 16:26:07.691773 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xxx" event={"ID":"671a0a21-c761-422a-beff-41dc8f866615","Type":"ContainerStarted","Data":"1dcc5d77dc84b28316a085b5c4025031ec9b9993683f714bb3955fa63270328e"} Feb 25 16:26:07 crc kubenswrapper[4937]: I0225 16:26:07.717291 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z7xxx" podStartSLOduration=2.292462014 podStartE2EDuration="5.717271733s" podCreationTimestamp="2026-02-25 16:26:02 +0000 UTC" firstStartedPulling="2026-02-25 16:26:03.615772973 +0000 UTC m=+2414.629164863" lastFinishedPulling="2026-02-25 16:26:07.040582672 +0000 UTC m=+2418.053974582" observedRunningTime="2026-02-25 16:26:07.712131425 +0000 UTC m=+2418.725523355" watchObservedRunningTime="2026-02-25 16:26:07.717271733 +0000 UTC m=+2418.730663613" Feb 25 16:26:09 crc kubenswrapper[4937]: I0225 16:26:09.367954 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:26:09 crc kubenswrapper[4937]: E0225 16:26:09.368682 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:26:12 crc kubenswrapper[4937]: I0225 16:26:12.817927 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:12 crc kubenswrapper[4937]: I0225 16:26:12.818392 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:12 crc kubenswrapper[4937]: I0225 16:26:12.888672 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:13 crc kubenswrapper[4937]: I0225 16:26:13.805154 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:13 crc kubenswrapper[4937]: I0225 16:26:13.852414 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z7xxx"] Feb 25 16:26:15 crc kubenswrapper[4937]: I0225 16:26:15.778742 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z7xxx" podUID="671a0a21-c761-422a-beff-41dc8f866615" containerName="registry-server" 
containerID="cri-o://1dcc5d77dc84b28316a085b5c4025031ec9b9993683f714bb3955fa63270328e" gracePeriod=2 Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.398643 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.505575 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671a0a21-c761-422a-beff-41dc8f866615-utilities\") pod \"671a0a21-c761-422a-beff-41dc8f866615\" (UID: \"671a0a21-c761-422a-beff-41dc8f866615\") " Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.505846 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671a0a21-c761-422a-beff-41dc8f866615-catalog-content\") pod \"671a0a21-c761-422a-beff-41dc8f866615\" (UID: \"671a0a21-c761-422a-beff-41dc8f866615\") " Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.506212 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brhc4\" (UniqueName: \"kubernetes.io/projected/671a0a21-c761-422a-beff-41dc8f866615-kube-api-access-brhc4\") pod \"671a0a21-c761-422a-beff-41dc8f866615\" (UID: \"671a0a21-c761-422a-beff-41dc8f866615\") " Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.507313 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671a0a21-c761-422a-beff-41dc8f866615-utilities" (OuterVolumeSpecName: "utilities") pod "671a0a21-c761-422a-beff-41dc8f866615" (UID: "671a0a21-c761-422a-beff-41dc8f866615"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.511676 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671a0a21-c761-422a-beff-41dc8f866615-kube-api-access-brhc4" (OuterVolumeSpecName: "kube-api-access-brhc4") pod "671a0a21-c761-422a-beff-41dc8f866615" (UID: "671a0a21-c761-422a-beff-41dc8f866615"). InnerVolumeSpecName "kube-api-access-brhc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.555800 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671a0a21-c761-422a-beff-41dc8f866615-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "671a0a21-c761-422a-beff-41dc8f866615" (UID: "671a0a21-c761-422a-beff-41dc8f866615"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.609395 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brhc4\" (UniqueName: \"kubernetes.io/projected/671a0a21-c761-422a-beff-41dc8f866615-kube-api-access-brhc4\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.609430 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671a0a21-c761-422a-beff-41dc8f866615-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.609439 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671a0a21-c761-422a-beff-41dc8f866615-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.792640 4937 generic.go:334] "Generic (PLEG): container finished" podID="671a0a21-c761-422a-beff-41dc8f866615" containerID="1dcc5d77dc84b28316a085b5c4025031ec9b9993683f714bb3955fa63270328e" exitCode=0 Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.792704 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xxx" event={"ID":"671a0a21-c761-422a-beff-41dc8f866615","Type":"ContainerDied","Data":"1dcc5d77dc84b28316a085b5c4025031ec9b9993683f714bb3955fa63270328e"} Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.793015 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7xxx" event={"ID":"671a0a21-c761-422a-beff-41dc8f866615","Type":"ContainerDied","Data":"f0a2ce3c1c592452003c532cf072c06e6430a82cfc789349e20536f3d77b62f0"} Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.793043 4937 scope.go:117] "RemoveContainer" containerID="1dcc5d77dc84b28316a085b5c4025031ec9b9993683f714bb3955fa63270328e" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.792747 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z7xxx" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.829537 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z7xxx"] Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.831002 4937 scope.go:117] "RemoveContainer" containerID="7f2ce8d341d7acc932e8ebd40f6a6aee2e6840cd15afec319c374bb7a1e52f9f" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.838990 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z7xxx"] Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.860784 4937 scope.go:117] "RemoveContainer" containerID="41f6b5ccd628a6dc013eeb597df63cbc7d7d1770d5346bd3f091f51a9cbe1c74" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.916994 4937 scope.go:117] "RemoveContainer" containerID="1dcc5d77dc84b28316a085b5c4025031ec9b9993683f714bb3955fa63270328e" Feb 25 16:26:16 crc kubenswrapper[4937]: E0225 16:26:16.917631 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dcc5d77dc84b28316a085b5c4025031ec9b9993683f714bb3955fa63270328e\": container with ID starting with 1dcc5d77dc84b28316a085b5c4025031ec9b9993683f714bb3955fa63270328e not found: ID does not exist" containerID="1dcc5d77dc84b28316a085b5c4025031ec9b9993683f714bb3955fa63270328e" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.917687 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dcc5d77dc84b28316a085b5c4025031ec9b9993683f714bb3955fa63270328e"} err="failed to get container status \"1dcc5d77dc84b28316a085b5c4025031ec9b9993683f714bb3955fa63270328e\": rpc error: code = NotFound desc = could not find container \"1dcc5d77dc84b28316a085b5c4025031ec9b9993683f714bb3955fa63270328e\": container with ID starting with 1dcc5d77dc84b28316a085b5c4025031ec9b9993683f714bb3955fa63270328e not found: ID does not exist" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.917767 4937 scope.go:117] "RemoveContainer" containerID="7f2ce8d341d7acc932e8ebd40f6a6aee2e6840cd15afec319c374bb7a1e52f9f" Feb 25 16:26:16 crc kubenswrapper[4937]: E0225 16:26:16.918367 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f2ce8d341d7acc932e8ebd40f6a6aee2e6840cd15afec319c374bb7a1e52f9f\": container with ID starting with 7f2ce8d341d7acc932e8ebd40f6a6aee2e6840cd15afec319c374bb7a1e52f9f not found: ID does not exist" containerID="7f2ce8d341d7acc932e8ebd40f6a6aee2e6840cd15afec319c374bb7a1e52f9f" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.918408 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2ce8d341d7acc932e8ebd40f6a6aee2e6840cd15afec319c374bb7a1e52f9f"} err="failed to get container status \"7f2ce8d341d7acc932e8ebd40f6a6aee2e6840cd15afec319c374bb7a1e52f9f\": rpc error: code = NotFound desc = could not find container \"7f2ce8d341d7acc932e8ebd40f6a6aee2e6840cd15afec319c374bb7a1e52f9f\": container with ID starting with 7f2ce8d341d7acc932e8ebd40f6a6aee2e6840cd15afec319c374bb7a1e52f9f not found: ID does not exist" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.918438 4937 scope.go:117] "RemoveContainer" containerID="41f6b5ccd628a6dc013eeb597df63cbc7d7d1770d5346bd3f091f51a9cbe1c74" Feb 25 16:26:16 crc kubenswrapper[4937]: E0225 16:26:16.918839 4937 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"41f6b5ccd628a6dc013eeb597df63cbc7d7d1770d5346bd3f091f51a9cbe1c74\": container with ID starting with 41f6b5ccd628a6dc013eeb597df63cbc7d7d1770d5346bd3f091f51a9cbe1c74 not found: ID does not exist" containerID="41f6b5ccd628a6dc013eeb597df63cbc7d7d1770d5346bd3f091f51a9cbe1c74" Feb 25 16:26:16 crc kubenswrapper[4937]: I0225 16:26:16.918873 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f6b5ccd628a6dc013eeb597df63cbc7d7d1770d5346bd3f091f51a9cbe1c74"} err="failed to get container status \"41f6b5ccd628a6dc013eeb597df63cbc7d7d1770d5346bd3f091f51a9cbe1c74\": rpc error: code = NotFound desc = could not find container \"41f6b5ccd628a6dc013eeb597df63cbc7d7d1770d5346bd3f091f51a9cbe1c74\": container with ID starting with 41f6b5ccd628a6dc013eeb597df63cbc7d7d1770d5346bd3f091f51a9cbe1c74 not found: ID does not exist" Feb 25 16:26:17 crc kubenswrapper[4937]: I0225 16:26:17.381024 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="671a0a21-c761-422a-beff-41dc8f866615" path="/var/lib/kubelet/pods/671a0a21-c761-422a-beff-41dc8f866615/volumes" Feb 25 16:26:21 crc kubenswrapper[4937]: I0225 16:26:21.373596 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:26:21 crc kubenswrapper[4937]: E0225 16:26:21.374419 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:26:23 crc kubenswrapper[4937]: I0225 16:26:23.050060 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-6zhwd"] Feb 25 16:26:23 crc kubenswrapper[4937]: I0225 16:26:23.060075 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-6zhwd"] Feb 25 16:26:23 crc kubenswrapper[4937]: I0225 16:26:23.411340 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8520984-c620-42eb-ae9d-54a40aa32b55" path="/var/lib/kubelet/pods/b8520984-c620-42eb-ae9d-54a40aa32b55/volumes" Feb 25 16:26:29 crc kubenswrapper[4937]: I0225 16:26:29.044926 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-bfq7k"] Feb 25 16:26:29 crc kubenswrapper[4937]: I0225 16:26:29.065316 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-bfq7k"] Feb 25 16:26:29 crc kubenswrapper[4937]: I0225 16:26:29.383270 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ee73347-7944-41ba-b1ee-26c8c95f3be6" path="/var/lib/kubelet/pods/8ee73347-7944-41ba-b1ee-26c8c95f3be6/volumes" Feb 25 16:26:36 crc kubenswrapper[4937]: I0225 16:26:36.368804 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:26:36 crc kubenswrapper[4937]: E0225 16:26:36.371065 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:26:39 crc kubenswrapper[4937]: I0225 16:26:39.028907 4937 generic.go:334] "Generic (PLEG): container finished" podID="d08f7150-84a3-42bf-bed8-624a7f5e2c35" containerID="42a8f122d451424a56206f9206aa4bce0f0841a66fbd058b4a77dee9ce4f4f1f" exitCode=0 Feb 25 16:26:39 crc kubenswrapper[4937]: I0225 16:26:39.028986 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" event={"ID":"d08f7150-84a3-42bf-bed8-624a7f5e2c35","Type":"ContainerDied","Data":"42a8f122d451424a56206f9206aa4bce0f0841a66fbd058b4a77dee9ce4f4f1f"} Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.607688 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.705979 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.706244 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-telemetry-combined-ca-bundle\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.706287 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-repo-setup-combined-ca-bundle\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.706343 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-neutron-metadata-combined-ca-bundle\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.706361 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.706453 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-inventory\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.706541 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.706594 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-ssh-key-openstack-edpm-ipam\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.706632 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-ovn-combined-ca-bundle\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.706667 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72m5l\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-kube-api-access-72m5l\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.706719 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-libvirt-combined-ca-bundle\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.706758 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-bootstrap-combined-ca-bundle\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.706858 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-ovn-default-certs-0\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.706874 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-nova-combined-ca-bundle\") pod \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\" (UID: \"d08f7150-84a3-42bf-bed8-624a7f5e2c35\") " Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.713707 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.714230 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.714370 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.714782 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.715347 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-kube-api-access-72m5l" (OuterVolumeSpecName: "kube-api-access-72m5l") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "kube-api-access-72m5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.715378 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.715465 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.716099 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.716238 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.716756 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.717059 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.717151 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.747873 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-inventory" (OuterVolumeSpecName: "inventory") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.751148 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d08f7150-84a3-42bf-bed8-624a7f5e2c35" (UID: "d08f7150-84a3-42bf-bed8-624a7f5e2c35"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809125 4937 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809163 4937 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809175 4937 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809189 4937 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809200 4937 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809474 4937 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809511 4937 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809526 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809540 4937 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809552 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809566 4937 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809576 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72m5l\" (UniqueName: 
\"kubernetes.io/projected/d08f7150-84a3-42bf-bed8-624a7f5e2c35-kube-api-access-72m5l\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809607 4937 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:40 crc kubenswrapper[4937]: I0225 16:26:40.809616 4937 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08f7150-84a3-42bf-bed8-624a7f5e2c35-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.056451 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" event={"ID":"d08f7150-84a3-42bf-bed8-624a7f5e2c35","Type":"ContainerDied","Data":"2385f1e4300047131519ff6e535428b3319c27fb120fe55863131976e7986e2a"} Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.056510 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2385f1e4300047131519ff6e535428b3319c27fb120fe55863131976e7986e2a" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.056554 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r74cs" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.118291 4937 scope.go:117] "RemoveContainer" containerID="c004804f9cfdef65508b11f0a69dcc00575b8841b01bbc64b07143b2dfbb8025" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.163723 4937 scope.go:117] "RemoveContainer" containerID="a4a7497aa724f8c0c1e182934a3c79d21817bf8baff829875b9f1ddf561476e7" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.206891 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf"] Feb 25 16:26:41 crc kubenswrapper[4937]: E0225 16:26:41.207409 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08f7150-84a3-42bf-bed8-624a7f5e2c35" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.207429 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08f7150-84a3-42bf-bed8-624a7f5e2c35" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 25 16:26:41 crc kubenswrapper[4937]: E0225 16:26:41.207459 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671a0a21-c761-422a-beff-41dc8f866615" containerName="extract-utilities" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.207467 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="671a0a21-c761-422a-beff-41dc8f866615" containerName="extract-utilities" Feb 25 16:26:41 crc kubenswrapper[4937]: E0225 16:26:41.207508 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671a0a21-c761-422a-beff-41dc8f866615" containerName="extract-content" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.207516 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="671a0a21-c761-422a-beff-41dc8f866615" containerName="extract-content" Feb 25 16:26:41 crc kubenswrapper[4937]: E0225 16:26:41.207531 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671a0a21-c761-422a-beff-41dc8f866615" containerName="registry-server" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.207539 4937 
state_mem.go:107] "Deleted CPUSet assignment" podUID="671a0a21-c761-422a-beff-41dc8f866615" containerName="registry-server" Feb 25 16:26:41 crc kubenswrapper[4937]: E0225 16:26:41.207553 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17608ad1-2ff4-446f-af20-a509268ac519" containerName="oc" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.207562 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="17608ad1-2ff4-446f-af20-a509268ac519" containerName="oc" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.207837 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="17608ad1-2ff4-446f-af20-a509268ac519" containerName="oc" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.207858 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08f7150-84a3-42bf-bed8-624a7f5e2c35" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.207894 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="671a0a21-c761-422a-beff-41dc8f866615" containerName="registry-server" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.208795 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.213051 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.213372 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.213650 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.213871 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.214280 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.219721 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf"] Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.232244 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.232356 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcfl4\" (UniqueName: \"kubernetes.io/projected/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-kube-api-access-bcfl4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.232760 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.232843 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.232898 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.258607 4937 scope.go:117] "RemoveContainer" containerID="10d3a46390a867296e3caaeb51b08e16c958248b99eadff95dd7ece58428663b" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.334301 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.334352 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.334394 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.334429 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.334465 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcfl4\" (UniqueName: \"kubernetes.io/projected/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-kube-api-access-bcfl4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc 
kubenswrapper[4937]: I0225 16:26:41.336392 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.340033 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.342285 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.342667 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.354514 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcfl4\" (UniqueName: \"kubernetes.io/projected/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-kube-api-access-bcfl4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4mrhf\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:41 crc kubenswrapper[4937]: I0225 16:26:41.534253 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:26:42 crc kubenswrapper[4937]: I0225 16:26:42.092780 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf"] Feb 25 16:26:42 crc kubenswrapper[4937]: W0225 16:26:42.096151 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf8ba197_d732_4514_9b22_4d2aa6f5a7f6.slice/crio-209ab48384b71ee39e8bfc7b0c100927fae4bed7fe819b256631c7f98de92aca WatchSource:0}: Error finding container 209ab48384b71ee39e8bfc7b0c100927fae4bed7fe819b256631c7f98de92aca: Status 404 returned error can't find the container with id 209ab48384b71ee39e8bfc7b0c100927fae4bed7fe819b256631c7f98de92aca Feb 25 16:26:43 crc kubenswrapper[4937]: I0225 16:26:43.077396 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" event={"ID":"af8ba197-d732-4514-9b22-4d2aa6f5a7f6","Type":"ContainerStarted","Data":"9d964d3035659b135557ae24ec0070f671f8234689537ed5dd4c6b3ab77b087d"} Feb 25 16:26:43 crc kubenswrapper[4937]: I0225 16:26:43.077896 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" event={"ID":"af8ba197-d732-4514-9b22-4d2aa6f5a7f6","Type":"ContainerStarted","Data":"209ab48384b71ee39e8bfc7b0c100927fae4bed7fe819b256631c7f98de92aca"} Feb 25 16:26:43 crc kubenswrapper[4937]: I0225 16:26:43.099661 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" podStartSLOduration=1.434951254 podStartE2EDuration="2.099643876s" podCreationTimestamp="2026-02-25 16:26:41 +0000 UTC" firstStartedPulling="2026-02-25 16:26:42.099006008 +0000 UTC m=+2453.112397908" lastFinishedPulling="2026-02-25 16:26:42.76369859 +0000 UTC m=+2453.777090530" observedRunningTime="2026-02-25 16:26:43.094973809 +0000 UTC m=+2454.108365719" watchObservedRunningTime="2026-02-25 16:26:43.099643876 +0000 UTC m=+2454.113035766" Feb 25 16:26:49 crc kubenswrapper[4937]: I0225 16:26:49.367600 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:26:49 crc kubenswrapper[4937]: E0225 16:26:49.369406 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:27:04 crc kubenswrapper[4937]: I0225 16:27:04.368346 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:27:04 crc kubenswrapper[4937]: E0225 16:27:04.370106 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:27:18 crc kubenswrapper[4937]: I0225 16:27:18.367395 4937 scope.go:117] 
"RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:27:18 crc kubenswrapper[4937]: E0225 16:27:18.368146 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:27:30 crc kubenswrapper[4937]: I0225 16:27:30.369260 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:27:30 crc kubenswrapper[4937]: E0225 16:27:30.372817 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:27:41 crc kubenswrapper[4937]: I0225 16:27:41.384790 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:27:41 crc kubenswrapper[4937]: E0225 16:27:41.385954 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:27:42 crc kubenswrapper[4937]: E0225 16:27:42.094854 4937 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf8ba197_d732_4514_9b22_4d2aa6f5a7f6.slice/crio-9d964d3035659b135557ae24ec0070f671f8234689537ed5dd4c6b3ab77b087d.scope\": RecentStats: unable to find data in memory cache]" Feb 25 16:27:42 crc kubenswrapper[4937]: I0225 16:27:42.740407 4937 generic.go:334] "Generic (PLEG): container finished" podID="af8ba197-d732-4514-9b22-4d2aa6f5a7f6" containerID="9d964d3035659b135557ae24ec0070f671f8234689537ed5dd4c6b3ab77b087d" exitCode=0 Feb 25 16:27:42 crc kubenswrapper[4937]: I0225 16:27:42.740519 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" event={"ID":"af8ba197-d732-4514-9b22-4d2aa6f5a7f6","Type":"ContainerDied","Data":"9d964d3035659b135557ae24ec0070f671f8234689537ed5dd4c6b3ab77b087d"} Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.288173 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.378285 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcfl4\" (UniqueName: \"kubernetes.io/projected/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-kube-api-access-bcfl4\") pod \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.378809 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ovn-combined-ca-bundle\") pod \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.378847 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ovncontroller-config-0\") pod \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.378876 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ssh-key-openstack-edpm-ipam\") pod \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.378901 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-inventory\") pod \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\" (UID: \"af8ba197-d732-4514-9b22-4d2aa6f5a7f6\") " Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.384021 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "af8ba197-d732-4514-9b22-4d2aa6f5a7f6" (UID: "af8ba197-d732-4514-9b22-4d2aa6f5a7f6"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.384329 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-kube-api-access-bcfl4" (OuterVolumeSpecName: "kube-api-access-bcfl4") pod "af8ba197-d732-4514-9b22-4d2aa6f5a7f6" (UID: "af8ba197-d732-4514-9b22-4d2aa6f5a7f6"). InnerVolumeSpecName "kube-api-access-bcfl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.406979 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "af8ba197-d732-4514-9b22-4d2aa6f5a7f6" (UID: "af8ba197-d732-4514-9b22-4d2aa6f5a7f6"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.416072 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-inventory" (OuterVolumeSpecName: "inventory") pod "af8ba197-d732-4514-9b22-4d2aa6f5a7f6" (UID: "af8ba197-d732-4514-9b22-4d2aa6f5a7f6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.433306 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "af8ba197-d732-4514-9b22-4d2aa6f5a7f6" (UID: "af8ba197-d732-4514-9b22-4d2aa6f5a7f6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.481729 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.481762 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.481772 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcfl4\" (UniqueName: \"kubernetes.io/projected/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-kube-api-access-bcfl4\") on node \"crc\" DevicePath \"\"" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.481780 4937 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.481789 4937 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af8ba197-d732-4514-9b22-4d2aa6f5a7f6-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.768231 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" event={"ID":"af8ba197-d732-4514-9b22-4d2aa6f5a7f6","Type":"ContainerDied","Data":"209ab48384b71ee39e8bfc7b0c100927fae4bed7fe819b256631c7f98de92aca"} Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.768290 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="209ab48384b71ee39e8bfc7b0c100927fae4bed7fe819b256631c7f98de92aca" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.768349 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4mrhf" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.858144 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss"] Feb 25 16:27:44 crc kubenswrapper[4937]: E0225 16:27:44.858641 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8ba197-d732-4514-9b22-4d2aa6f5a7f6" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.858662 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8ba197-d732-4514-9b22-4d2aa6f5a7f6" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.858950 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8ba197-d732-4514-9b22-4d2aa6f5a7f6" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.859986 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.862774 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.863196 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.863556 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.863798 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.866302 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.866890 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.873202 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss"] Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.999108 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.999198 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdfhl\" (UniqueName: \"kubernetes.io/projected/9851d2ed-9455-4797-bcad-ed3b82909df5-kube-api-access-hdfhl\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.999538 4937 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.999724 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:44 crc kubenswrapper[4937]: I0225 16:27:44.999782 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.000146 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.102379 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.102454 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.102545 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdfhl\" (UniqueName: \"kubernetes.io/projected/9851d2ed-9455-4797-bcad-ed3b82909df5-kube-api-access-hdfhl\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.102671 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.102738 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.102767 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.106768 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.107004 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.108696 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.109116 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.112421 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.122617 4937 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdfhl\" (UniqueName: \"kubernetes.io/projected/9851d2ed-9455-4797-bcad-ed3b82909df5-kube-api-access-hdfhl\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.180449 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.744641 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.747933 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss"] Feb 25 16:27:45 crc kubenswrapper[4937]: I0225 16:27:45.816137 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" event={"ID":"9851d2ed-9455-4797-bcad-ed3b82909df5","Type":"ContainerStarted","Data":"d1e5ca8ec4d5f7b3ec541eec158dfbd1f5294aa8968a3ae0dd167217c72e64e7"} Feb 25 16:27:46 crc kubenswrapper[4937]: I0225 16:27:46.825468 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" event={"ID":"9851d2ed-9455-4797-bcad-ed3b82909df5","Type":"ContainerStarted","Data":"e0f91f60d2cd7dc10b66eea72aab6af93750d9fc46bc4335cc7fc75923d52650"} Feb 25 16:27:46 crc kubenswrapper[4937]: I0225 16:27:46.849101 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" podStartSLOduration=2.381951467 podStartE2EDuration="2.849083129s" podCreationTimestamp="2026-02-25 16:27:44 +0000 UTC" firstStartedPulling="2026-02-25 16:27:45.744258991 +0000 UTC m=+2516.757650921" lastFinishedPulling="2026-02-25 16:27:46.211390693 +0000 UTC m=+2517.224782583" observedRunningTime="2026-02-25 16:27:46.842300229 +0000 UTC m=+2517.855692119" watchObservedRunningTime="2026-02-25 16:27:46.849083129 +0000 UTC m=+2517.862475019" Feb 25 16:27:54 crc kubenswrapper[4937]: I0225 16:27:54.369851 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:27:54 crc kubenswrapper[4937]: E0225 16:27:54.370630 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:28:00 crc kubenswrapper[4937]: I0225 16:28:00.161954 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533948-dd2gz"] Feb 25 16:28:00 crc kubenswrapper[4937]: I0225 16:28:00.164644 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533948-dd2gz" Feb 25 16:28:00 crc kubenswrapper[4937]: I0225 16:28:00.167028 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:28:00 crc kubenswrapper[4937]: I0225 16:28:00.167474 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:28:00 crc kubenswrapper[4937]: I0225 16:28:00.174650 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:28:00 crc kubenswrapper[4937]: I0225 16:28:00.184534 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533948-dd2gz"] Feb 25 16:28:00 crc kubenswrapper[4937]: I0225 16:28:00.251906 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lr8s\" (UniqueName: \"kubernetes.io/projected/598a8b71-943c-4906-842e-95a601cb6dc9-kube-api-access-2lr8s\") pod \"auto-csr-approver-29533948-dd2gz\" (UID: \"598a8b71-943c-4906-842e-95a601cb6dc9\") " pod="openshift-infra/auto-csr-approver-29533948-dd2gz" Feb 25 16:28:00 crc kubenswrapper[4937]: I0225 16:28:00.354983 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lr8s\" (UniqueName: \"kubernetes.io/projected/598a8b71-943c-4906-842e-95a601cb6dc9-kube-api-access-2lr8s\") pod \"auto-csr-approver-29533948-dd2gz\" (UID: \"598a8b71-943c-4906-842e-95a601cb6dc9\") " pod="openshift-infra/auto-csr-approver-29533948-dd2gz" Feb 25 16:28:00 crc kubenswrapper[4937]: I0225 16:28:00.374138 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lr8s\" (UniqueName: \"kubernetes.io/projected/598a8b71-943c-4906-842e-95a601cb6dc9-kube-api-access-2lr8s\") pod \"auto-csr-approver-29533948-dd2gz\" (UID: \"598a8b71-943c-4906-842e-95a601cb6dc9\") " pod="openshift-infra/auto-csr-approver-29533948-dd2gz" Feb 25 16:28:00 crc kubenswrapper[4937]: I0225 16:28:00.493400 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533948-dd2gz" Feb 25 16:28:00 crc kubenswrapper[4937]: I0225 16:28:00.967045 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533948-dd2gz"] Feb 25 16:28:00 crc kubenswrapper[4937]: I0225 16:28:00.979380 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533948-dd2gz" event={"ID":"598a8b71-943c-4906-842e-95a601cb6dc9","Type":"ContainerStarted","Data":"e42fdd2c617a14f6d51be8fe7b1c935a892474a186d418f3a228bfef85add0b1"} Feb 25 16:28:02 crc kubenswrapper[4937]: E0225 16:28:02.681311 4937 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod598a8b71_943c_4906_842e_95a601cb6dc9.slice/crio-conmon-abaaa8f0cd3c809d47542f02231dd5a6bb0229f212b06519def845720407bb52.scope\": RecentStats: unable to find data in memory cache]" Feb 25 16:28:03 crc kubenswrapper[4937]: I0225 16:28:03.000744 4937 generic.go:334] "Generic (PLEG): container finished" podID="598a8b71-943c-4906-842e-95a601cb6dc9" containerID="abaaa8f0cd3c809d47542f02231dd5a6bb0229f212b06519def845720407bb52" exitCode=0 Feb 25 16:28:03 crc kubenswrapper[4937]: I0225 16:28:03.000882 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533948-dd2gz" event={"ID":"598a8b71-943c-4906-842e-95a601cb6dc9","Type":"ContainerDied","Data":"abaaa8f0cd3c809d47542f02231dd5a6bb0229f212b06519def845720407bb52"} Feb 25 16:28:04 crc kubenswrapper[4937]: I0225 16:28:04.522861 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533948-dd2gz" Feb 25 16:28:04 crc kubenswrapper[4937]: I0225 16:28:04.675281 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lr8s\" (UniqueName: \"kubernetes.io/projected/598a8b71-943c-4906-842e-95a601cb6dc9-kube-api-access-2lr8s\") pod \"598a8b71-943c-4906-842e-95a601cb6dc9\" (UID: \"598a8b71-943c-4906-842e-95a601cb6dc9\") " Feb 25 16:28:04 crc kubenswrapper[4937]: I0225 16:28:04.682129 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598a8b71-943c-4906-842e-95a601cb6dc9-kube-api-access-2lr8s" (OuterVolumeSpecName: "kube-api-access-2lr8s") pod "598a8b71-943c-4906-842e-95a601cb6dc9" (UID: "598a8b71-943c-4906-842e-95a601cb6dc9"). InnerVolumeSpecName "kube-api-access-2lr8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:28:04 crc kubenswrapper[4937]: I0225 16:28:04.785802 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lr8s\" (UniqueName: \"kubernetes.io/projected/598a8b71-943c-4906-842e-95a601cb6dc9-kube-api-access-2lr8s\") on node \"crc\" DevicePath \"\"" Feb 25 16:28:05 crc kubenswrapper[4937]: I0225 16:28:05.026391 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533948-dd2gz" event={"ID":"598a8b71-943c-4906-842e-95a601cb6dc9","Type":"ContainerDied","Data":"e42fdd2c617a14f6d51be8fe7b1c935a892474a186d418f3a228bfef85add0b1"} Feb 25 16:28:05 crc kubenswrapper[4937]: I0225 16:28:05.026439 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e42fdd2c617a14f6d51be8fe7b1c935a892474a186d418f3a228bfef85add0b1" Feb 25 16:28:05 crc kubenswrapper[4937]: I0225 16:28:05.026522 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533948-dd2gz" Feb 25 16:28:05 crc kubenswrapper[4937]: I0225 16:28:05.369572 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:28:05 crc kubenswrapper[4937]: E0225 16:28:05.370145 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:28:05 crc kubenswrapper[4937]: I0225 16:28:05.600104 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533942-kqhw5"] Feb 25 16:28:05 crc kubenswrapper[4937]: I0225 16:28:05.609417 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533942-kqhw5"] Feb 25 16:28:07 crc kubenswrapper[4937]: I0225 16:28:07.382455 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7fbac47-0089-4e3d-99c6-33efb288a426" path="/var/lib/kubelet/pods/c7fbac47-0089-4e3d-99c6-33efb288a426/volumes" Feb 25 16:28:18 crc kubenswrapper[4937]: I0225 16:28:18.367764 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:28:18 crc kubenswrapper[4937]: E0225 16:28:18.368528 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:28:29 crc kubenswrapper[4937]: I0225 16:28:29.368720 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:28:29 crc kubenswrapper[4937]: E0225 16:28:29.369723 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:28:31 crc kubenswrapper[4937]: I0225 16:28:31.303845 4937 generic.go:334] "Generic (PLEG): container finished" podID="9851d2ed-9455-4797-bcad-ed3b82909df5" containerID="e0f91f60d2cd7dc10b66eea72aab6af93750d9fc46bc4335cc7fc75923d52650" exitCode=0 Feb 25 16:28:31 crc kubenswrapper[4937]: I0225 16:28:31.304093 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" event={"ID":"9851d2ed-9455-4797-bcad-ed3b82909df5","Type":"ContainerDied","Data":"e0f91f60d2cd7dc10b66eea72aab6af93750d9fc46bc4335cc7fc75923d52650"} Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.048068 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.249290 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-ssh-key-openstack-edpm-ipam\") pod \"9851d2ed-9455-4797-bcad-ed3b82909df5\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.250283 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdfhl\" (UniqueName: \"kubernetes.io/projected/9851d2ed-9455-4797-bcad-ed3b82909df5-kube-api-access-hdfhl\") pod \"9851d2ed-9455-4797-bcad-ed3b82909df5\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.250345 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-neutron-metadata-combined-ca-bundle\") pod \"9851d2ed-9455-4797-bcad-ed3b82909df5\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.250370 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9851d2ed-9455-4797-bcad-ed3b82909df5\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.250399 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-nova-metadata-neutron-config-0\") pod \"9851d2ed-9455-4797-bcad-ed3b82909df5\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.250449 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-inventory\") pod \"9851d2ed-9455-4797-bcad-ed3b82909df5\" (UID: \"9851d2ed-9455-4797-bcad-ed3b82909df5\") " Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.255761 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9851d2ed-9455-4797-bcad-ed3b82909df5-kube-api-access-hdfhl" (OuterVolumeSpecName: "kube-api-access-hdfhl") pod "9851d2ed-9455-4797-bcad-ed3b82909df5" (UID: "9851d2ed-9455-4797-bcad-ed3b82909df5"). InnerVolumeSpecName "kube-api-access-hdfhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.261353 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9851d2ed-9455-4797-bcad-ed3b82909df5" (UID: "9851d2ed-9455-4797-bcad-ed3b82909df5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.281417 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9851d2ed-9455-4797-bcad-ed3b82909df5" (UID: "9851d2ed-9455-4797-bcad-ed3b82909df5"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.295511 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9851d2ed-9455-4797-bcad-ed3b82909df5" (UID: "9851d2ed-9455-4797-bcad-ed3b82909df5"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.301439 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-inventory" (OuterVolumeSpecName: "inventory") pod "9851d2ed-9455-4797-bcad-ed3b82909df5" (UID: "9851d2ed-9455-4797-bcad-ed3b82909df5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.312463 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9851d2ed-9455-4797-bcad-ed3b82909df5" (UID: "9851d2ed-9455-4797-bcad-ed3b82909df5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.324660 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" event={"ID":"9851d2ed-9455-4797-bcad-ed3b82909df5","Type":"ContainerDied","Data":"d1e5ca8ec4d5f7b3ec541eec158dfbd1f5294aa8968a3ae0dd167217c72e64e7"} Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.324699 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.324705 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1e5ca8ec4d5f7b3ec541eec158dfbd1f5294aa8968a3ae0dd167217c72e64e7" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.357948 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdfhl\" (UniqueName: \"kubernetes.io/projected/9851d2ed-9455-4797-bcad-ed3b82909df5-kube-api-access-hdfhl\") on node \"crc\" DevicePath \"\"" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.357994 4937 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.358012 4937 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.358027 4937 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.358040 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.358051 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9851d2ed-9455-4797-bcad-ed3b82909df5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.453810 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct"] Feb 25 16:28:33 crc kubenswrapper[4937]: E0225 16:28:33.454336 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598a8b71-943c-4906-842e-95a601cb6dc9" containerName="oc" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.454362 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="598a8b71-943c-4906-842e-95a601cb6dc9" containerName="oc" Feb 25 16:28:33 crc kubenswrapper[4937]: E0225 16:28:33.454382 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9851d2ed-9455-4797-bcad-ed3b82909df5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.454393 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="9851d2ed-9455-4797-bcad-ed3b82909df5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.454676 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="9851d2ed-9455-4797-bcad-ed3b82909df5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.454721 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="598a8b71-943c-4906-842e-95a601cb6dc9" containerName="oc" Feb 25 16:28:33 crc 
kubenswrapper[4937]: I0225 16:28:33.455630 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.465649 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct"] Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.470664 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.470726 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4jkp\" (UniqueName: \"kubernetes.io/projected/b1499078-381f-48bd-bcfb-c9bd057fa5d2-kube-api-access-w4jkp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.470817 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.470890 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.470927 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.502757 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.502827 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.503063 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.503174 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.503067 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 
16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.572963 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.573352 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.573384 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.573439 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.573498 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4jkp\" (UniqueName: \"kubernetes.io/projected/b1499078-381f-48bd-bcfb-c9bd057fa5d2-kube-api-access-w4jkp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.592686 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.593943 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.594765 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.595322 4937 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4jkp\" (UniqueName: \"kubernetes.io/projected/b1499078-381f-48bd-bcfb-c9bd057fa5d2-kube-api-access-w4jkp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.596462 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bzxct\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:33 crc kubenswrapper[4937]: I0225 16:28:33.823561 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:28:34 crc kubenswrapper[4937]: I0225 16:28:34.384779 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct"] Feb 25 16:28:34 crc kubenswrapper[4937]: W0225 16:28:34.390986 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1499078_381f_48bd_bcfb_c9bd057fa5d2.slice/crio-f624b88cf4ae2920553a48796c0c12b6253845c7ba20b99f88a3ab2a8662ab4e WatchSource:0}: Error finding container f624b88cf4ae2920553a48796c0c12b6253845c7ba20b99f88a3ab2a8662ab4e: Status 404 returned error can't find the container with id f624b88cf4ae2920553a48796c0c12b6253845c7ba20b99f88a3ab2a8662ab4e Feb 25 16:28:35 crc kubenswrapper[4937]: I0225 16:28:35.357224 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" event={"ID":"b1499078-381f-48bd-bcfb-c9bd057fa5d2","Type":"ContainerStarted","Data":"c814729060cf49e9209127683d80d630ae10cd333acfc4a9dbc75e1ab62da0ba"} Feb 25 16:28:35 crc kubenswrapper[4937]: I0225 16:28:35.359070 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" event={"ID":"b1499078-381f-48bd-bcfb-c9bd057fa5d2","Type":"ContainerStarted","Data":"f624b88cf4ae2920553a48796c0c12b6253845c7ba20b99f88a3ab2a8662ab4e"} Feb 25 16:28:41 crc kubenswrapper[4937]: I0225 16:28:41.413383 4937 scope.go:117] "RemoveContainer" containerID="620fcd4241f5384fb706d1ba9a49c4ada7f8eb59dbd0647d795ac1d8db850525" Feb 25 16:28:42 crc kubenswrapper[4937]: I0225 16:28:42.367763 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:28:42 crc kubenswrapper[4937]: E0225 16:28:42.368735 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:28:57 crc kubenswrapper[4937]: I0225 16:28:57.367979 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:28:57 crc kubenswrapper[4937]: E0225 16:28:57.369046 4937 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:29:08 crc kubenswrapper[4937]: I0225 16:29:08.367764 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:29:08 crc kubenswrapper[4937]: E0225 16:29:08.368824 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:29:23 crc kubenswrapper[4937]: I0225 16:29:23.369271 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:29:23 crc kubenswrapper[4937]: E0225 16:29:23.370021 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:29:35 crc kubenswrapper[4937]: I0225 16:29:35.367893 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:29:35 crc kubenswrapper[4937]: E0225 16:29:35.368718 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:29:47 crc kubenswrapper[4937]: I0225 16:29:47.368624 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:29:47 crc kubenswrapper[4937]: E0225 16:29:47.370906 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.149955 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" podStartSLOduration=86.612071917 podStartE2EDuration="1m27.149933933s" podCreationTimestamp="2026-02-25 16:28:33 +0000 UTC" firstStartedPulling="2026-02-25 16:28:34.39288008 +0000 UTC m=+2565.406271970" lastFinishedPulling="2026-02-25 16:28:34.930742096 +0000 UTC 
m=+2565.944133986" observedRunningTime="2026-02-25 16:28:35.388428237 +0000 UTC m=+2566.401820187" watchObservedRunningTime="2026-02-25 16:30:00.149933933 +0000 UTC m=+2651.163325823" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.153064 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533950-dzf9p"] Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.155367 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533950-dzf9p" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.161165 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.161368 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.161541 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.162122 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz"] Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.164438 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.166744 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.167337 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.176567 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533950-dzf9p"] Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.190805 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz"] Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.345019 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wshtv\" (UniqueName: \"kubernetes.io/projected/f0662160-887f-42d1-873b-605e522adf02-kube-api-access-wshtv\") pod \"auto-csr-approver-29533950-dzf9p\" (UID: \"f0662160-887f-42d1-873b-605e522adf02\") " pod="openshift-infra/auto-csr-approver-29533950-dzf9p" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.345093 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-secret-volume\") pod \"collect-profiles-29533950-xqjvz\" (UID: \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.345119 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-config-volume\") pod \"collect-profiles-29533950-xqjvz\" (UID: \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.345218 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkclv\" (UniqueName: \"kubernetes.io/projected/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-kube-api-access-zkclv\") pod \"collect-profiles-29533950-xqjvz\" (UID: \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.367972 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:30:00 crc kubenswrapper[4937]: E0225 16:30:00.368334 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.447705 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wshtv\" (UniqueName: \"kubernetes.io/projected/f0662160-887f-42d1-873b-605e522adf02-kube-api-access-wshtv\") pod \"auto-csr-approver-29533950-dzf9p\" (UID: \"f0662160-887f-42d1-873b-605e522adf02\") " pod="openshift-infra/auto-csr-approver-29533950-dzf9p" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.447779 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-secret-volume\") pod \"collect-profiles-29533950-xqjvz\" (UID: \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.447803 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-config-volume\") pod \"collect-profiles-29533950-xqjvz\" (UID: \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.448000 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkclv\" (UniqueName: \"kubernetes.io/projected/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-kube-api-access-zkclv\") pod \"collect-profiles-29533950-xqjvz\" (UID: \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.449512 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-config-volume\") pod \"collect-profiles-29533950-xqjvz\" (UID: \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.459526 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-secret-volume\") pod 
\"collect-profiles-29533950-xqjvz\" (UID: \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.465684 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkclv\" (UniqueName: \"kubernetes.io/projected/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-kube-api-access-zkclv\") pod \"collect-profiles-29533950-xqjvz\" (UID: \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.468022 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wshtv\" (UniqueName: \"kubernetes.io/projected/f0662160-887f-42d1-873b-605e522adf02-kube-api-access-wshtv\") pod \"auto-csr-approver-29533950-dzf9p\" (UID: \"f0662160-887f-42d1-873b-605e522adf02\") " pod="openshift-infra/auto-csr-approver-29533950-dzf9p" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.490074 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533950-dzf9p" Feb 25 16:30:00 crc kubenswrapper[4937]: I0225 16:30:00.506641 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" Feb 25 16:30:01 crc kubenswrapper[4937]: I0225 16:30:01.014054 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533950-dzf9p"] Feb 25 16:30:01 crc kubenswrapper[4937]: I0225 16:30:01.117833 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz"] Feb 25 16:30:01 crc kubenswrapper[4937]: W0225 16:30:01.122564 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4453c26d_cf2a_49bf_ab1f_3fc7391a92d3.slice/crio-bf2673405af3bca04cd5cc17a17441e561d42f8d3a84f98a446211e0788ebfaf WatchSource:0}: Error finding container bf2673405af3bca04cd5cc17a17441e561d42f8d3a84f98a446211e0788ebfaf: Status 404 returned error can't find the container with id bf2673405af3bca04cd5cc17a17441e561d42f8d3a84f98a446211e0788ebfaf Feb 25 16:30:01 crc kubenswrapper[4937]: I0225 16:30:01.322139 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533950-dzf9p" event={"ID":"f0662160-887f-42d1-873b-605e522adf02","Type":"ContainerStarted","Data":"71a94a5a7bae92a62e2e3bd8e99beebe01889d749ff920fe4eca3d26aefe4e88"} Feb 25 16:30:01 crc kubenswrapper[4937]: I0225 16:30:01.324168 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" event={"ID":"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3","Type":"ContainerStarted","Data":"93d6ccfa1650be143d17020fbccb622c8012366e8ede0cf9efdfece4159b0338"} Feb 25 16:30:01 crc kubenswrapper[4937]: I0225 16:30:01.324220 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" event={"ID":"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3","Type":"ContainerStarted","Data":"bf2673405af3bca04cd5cc17a17441e561d42f8d3a84f98a446211e0788ebfaf"} Feb 25 16:30:01 crc kubenswrapper[4937]: I0225 16:30:01.348722 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" 
podStartSLOduration=1.348697034 podStartE2EDuration="1.348697034s" podCreationTimestamp="2026-02-25 16:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:30:01.340875379 +0000 UTC m=+2652.354267269" watchObservedRunningTime="2026-02-25 16:30:01.348697034 +0000 UTC m=+2652.362088924" Feb 25 16:30:02 crc kubenswrapper[4937]: I0225 16:30:02.333705 4937 generic.go:334] "Generic (PLEG): container finished" podID="4453c26d-cf2a-49bf-ab1f-3fc7391a92d3" containerID="93d6ccfa1650be143d17020fbccb622c8012366e8ede0cf9efdfece4159b0338" exitCode=0 Feb 25 16:30:02 crc kubenswrapper[4937]: I0225 16:30:02.333975 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" event={"ID":"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3","Type":"ContainerDied","Data":"93d6ccfa1650be143d17020fbccb622c8012366e8ede0cf9efdfece4159b0338"} Feb 25 16:30:03 crc kubenswrapper[4937]: I0225 16:30:03.351812 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533950-dzf9p" event={"ID":"f0662160-887f-42d1-873b-605e522adf02","Type":"ContainerStarted","Data":"d762949242ad0fcf565988d9777d577fe1149ce5167cecc9a73948a2a8b30f20"} Feb 25 16:30:03 crc kubenswrapper[4937]: I0225 16:30:03.385801 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533950-dzf9p" podStartSLOduration=1.4251871280000001 podStartE2EDuration="3.385777909s" podCreationTimestamp="2026-02-25 16:30:00 +0000 UTC" firstStartedPulling="2026-02-25 16:30:01.01836765 +0000 UTC m=+2652.031759540" lastFinishedPulling="2026-02-25 16:30:02.978958431 +0000 UTC m=+2653.992350321" observedRunningTime="2026-02-25 16:30:03.372806274 +0000 UTC m=+2654.386198164" watchObservedRunningTime="2026-02-25 16:30:03.385777909 +0000 UTC m=+2654.399169799" Feb 25 16:30:03 crc kubenswrapper[4937]: I0225 16:30:03.868253 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.027769 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-config-volume\") pod \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\" (UID: \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\") " Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.027937 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkclv\" (UniqueName: \"kubernetes.io/projected/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-kube-api-access-zkclv\") pod \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\" (UID: \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\") " Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.028117 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-secret-volume\") pod \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\" (UID: \"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3\") " Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.028641 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-config-volume" (OuterVolumeSpecName: "config-volume") pod "4453c26d-cf2a-49bf-ab1f-3fc7391a92d3" (UID: "4453c26d-cf2a-49bf-ab1f-3fc7391a92d3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.028787 4937 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.033455 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-kube-api-access-zkclv" (OuterVolumeSpecName: "kube-api-access-zkclv") pod "4453c26d-cf2a-49bf-ab1f-3fc7391a92d3" (UID: "4453c26d-cf2a-49bf-ab1f-3fc7391a92d3"). InnerVolumeSpecName "kube-api-access-zkclv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.037629 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4453c26d-cf2a-49bf-ab1f-3fc7391a92d3" (UID: "4453c26d-cf2a-49bf-ab1f-3fc7391a92d3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.130511 4937 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.130820 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkclv\" (UniqueName: \"kubernetes.io/projected/4453c26d-cf2a-49bf-ab1f-3fc7391a92d3-kube-api-access-zkclv\") on node \"crc\" DevicePath \"\"" Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.388688 4937 generic.go:334] "Generic (PLEG): container finished" podID="f0662160-887f-42d1-873b-605e522adf02" containerID="d762949242ad0fcf565988d9777d577fe1149ce5167cecc9a73948a2a8b30f20" exitCode=0 Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.389045 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533950-dzf9p" event={"ID":"f0662160-887f-42d1-873b-605e522adf02","Type":"ContainerDied","Data":"d762949242ad0fcf565988d9777d577fe1149ce5167cecc9a73948a2a8b30f20"} Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.396441 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" event={"ID":"4453c26d-cf2a-49bf-ab1f-3fc7391a92d3","Type":"ContainerDied","Data":"bf2673405af3bca04cd5cc17a17441e561d42f8d3a84f98a446211e0788ebfaf"} Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.396494 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf2673405af3bca04cd5cc17a17441e561d42f8d3a84f98a446211e0788ebfaf" Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.396572 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533950-xqjvz" Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.431622 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw"] Feb 25 16:30:04 crc kubenswrapper[4937]: I0225 16:30:04.440382 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533905-jckrw"] Feb 25 16:30:05 crc kubenswrapper[4937]: I0225 16:30:05.388870 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3b2c333-3db5-4de3-bcc1-944dfc35b2b3" path="/var/lib/kubelet/pods/d3b2c333-3db5-4de3-bcc1-944dfc35b2b3/volumes" Feb 25 16:30:05 crc kubenswrapper[4937]: I0225 16:30:05.866391 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533950-dzf9p" Feb 25 16:30:05 crc kubenswrapper[4937]: I0225 16:30:05.967049 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wshtv\" (UniqueName: \"kubernetes.io/projected/f0662160-887f-42d1-873b-605e522adf02-kube-api-access-wshtv\") pod \"f0662160-887f-42d1-873b-605e522adf02\" (UID: \"f0662160-887f-42d1-873b-605e522adf02\") " Feb 25 16:30:05 crc kubenswrapper[4937]: I0225 16:30:05.975917 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0662160-887f-42d1-873b-605e522adf02-kube-api-access-wshtv" (OuterVolumeSpecName: "kube-api-access-wshtv") pod "f0662160-887f-42d1-873b-605e522adf02" (UID: "f0662160-887f-42d1-873b-605e522adf02"). 
InnerVolumeSpecName "kube-api-access-wshtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:30:06 crc kubenswrapper[4937]: I0225 16:30:06.069729 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wshtv\" (UniqueName: \"kubernetes.io/projected/f0662160-887f-42d1-873b-605e522adf02-kube-api-access-wshtv\") on node \"crc\" DevicePath \"\"" Feb 25 16:30:06 crc kubenswrapper[4937]: I0225 16:30:06.420138 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533950-dzf9p" event={"ID":"f0662160-887f-42d1-873b-605e522adf02","Type":"ContainerDied","Data":"71a94a5a7bae92a62e2e3bd8e99beebe01889d749ff920fe4eca3d26aefe4e88"} Feb 25 16:30:06 crc kubenswrapper[4937]: I0225 16:30:06.420183 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71a94a5a7bae92a62e2e3bd8e99beebe01889d749ff920fe4eca3d26aefe4e88" Feb 25 16:30:06 crc kubenswrapper[4937]: I0225 16:30:06.420207 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533950-dzf9p" Feb 25 16:30:06 crc kubenswrapper[4937]: I0225 16:30:06.480113 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533944-z7jph"] Feb 25 16:30:06 crc kubenswrapper[4937]: I0225 16:30:06.537791 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533944-z7jph"] Feb 25 16:30:07 crc kubenswrapper[4937]: I0225 16:30:07.387544 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cdd70e5-860d-4b0c-9f36-5727ac62a4ba" path="/var/lib/kubelet/pods/0cdd70e5-860d-4b0c-9f36-5727ac62a4ba/volumes" Feb 25 16:30:11 crc kubenswrapper[4937]: I0225 16:30:11.375076 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:30:11 crc kubenswrapper[4937]: E0225 16:30:11.376419 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:30:26 crc kubenswrapper[4937]: I0225 16:30:26.368402 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:30:26 crc kubenswrapper[4937]: E0225 16:30:26.369110 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:30:41 crc kubenswrapper[4937]: I0225 16:30:41.376007 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:30:41 crc kubenswrapper[4937]: E0225 16:30:41.377035 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:30:41 crc kubenswrapper[4937]: I0225 16:30:41.554291 4937 scope.go:117] "RemoveContainer" containerID="48d01a04fa661c299849ab0a5e5e6fdd42337fce0475e6e6a26fe17d302c978d" Feb 25 16:30:41 crc kubenswrapper[4937]: I0225 16:30:41.586862 4937 scope.go:117] "RemoveContainer" containerID="2d6b61db3d154e32ceab1a3129ef56d78a5745b9e13feb2efffbcc615a14978d" Feb 25 16:30:53 crc kubenswrapper[4937]: I0225 16:30:53.368290 4937 scope.go:117] "RemoveContainer" containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:30:53 crc kubenswrapper[4937]: I0225 16:30:53.956317 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"14bc3b494b958d91b2a8bca2f2d6a7e4ab3c7c93997b6a4481ec5bf58785334b"} Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.166898 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533952-fbfsk"] Feb 25 16:32:00 crc kubenswrapper[4937]: E0225 16:32:00.168654 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4453c26d-cf2a-49bf-ab1f-3fc7391a92d3" containerName="collect-profiles" Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.168678 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4453c26d-cf2a-49bf-ab1f-3fc7391a92d3" containerName="collect-profiles" Feb 25 16:32:00 crc kubenswrapper[4937]: E0225 16:32:00.168699 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0662160-887f-42d1-873b-605e522adf02" containerName="oc" Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.168711 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0662160-887f-42d1-873b-605e522adf02" containerName="oc" Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.169067 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0662160-887f-42d1-873b-605e522adf02" containerName="oc" Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.169085 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="4453c26d-cf2a-49bf-ab1f-3fc7391a92d3" containerName="collect-profiles" Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.170794 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533952-fbfsk" Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.172917 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.173743 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.173917 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.183684 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533952-fbfsk"] Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.316442 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkgf5\" (UniqueName: \"kubernetes.io/projected/9d355edc-ac55-4d29-9ece-deb38974fdbc-kube-api-access-qkgf5\") pod \"auto-csr-approver-29533952-fbfsk\" (UID: \"9d355edc-ac55-4d29-9ece-deb38974fdbc\") " pod="openshift-infra/auto-csr-approver-29533952-fbfsk" Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.419417 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkgf5\" (UniqueName: \"kubernetes.io/projected/9d355edc-ac55-4d29-9ece-deb38974fdbc-kube-api-access-qkgf5\") pod \"auto-csr-approver-29533952-fbfsk\" (UID: \"9d355edc-ac55-4d29-9ece-deb38974fdbc\") " pod="openshift-infra/auto-csr-approver-29533952-fbfsk" Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.439563 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkgf5\" (UniqueName: \"kubernetes.io/projected/9d355edc-ac55-4d29-9ece-deb38974fdbc-kube-api-access-qkgf5\") pod \"auto-csr-approver-29533952-fbfsk\" (UID: \"9d355edc-ac55-4d29-9ece-deb38974fdbc\") " pod="openshift-infra/auto-csr-approver-29533952-fbfsk" Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.498347 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533952-fbfsk" Feb 25 16:32:00 crc kubenswrapper[4937]: I0225 16:32:00.992756 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533952-fbfsk"] Feb 25 16:32:01 crc kubenswrapper[4937]: I0225 16:32:01.706467 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533952-fbfsk" event={"ID":"9d355edc-ac55-4d29-9ece-deb38974fdbc","Type":"ContainerStarted","Data":"5213b7308abfc989093ae97efffdc033aef3e2d24dfb5b038b74fdad97083714"} Feb 25 16:32:02 crc kubenswrapper[4937]: I0225 16:32:02.716615 4937 generic.go:334] "Generic (PLEG): container finished" podID="9d355edc-ac55-4d29-9ece-deb38974fdbc" containerID="120ad2dd64c86f8b0f145c84ff5da0c3a3649a29f6f0351dc0bf211404443128" exitCode=0 Feb 25 16:32:02 crc kubenswrapper[4937]: I0225 16:32:02.716718 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533952-fbfsk" event={"ID":"9d355edc-ac55-4d29-9ece-deb38974fdbc","Type":"ContainerDied","Data":"120ad2dd64c86f8b0f145c84ff5da0c3a3649a29f6f0351dc0bf211404443128"} Feb 25 16:32:04 crc kubenswrapper[4937]: I0225 16:32:04.245397 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533952-fbfsk" Feb 25 16:32:04 crc kubenswrapper[4937]: I0225 16:32:04.344190 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkgf5\" (UniqueName: \"kubernetes.io/projected/9d355edc-ac55-4d29-9ece-deb38974fdbc-kube-api-access-qkgf5\") pod \"9d355edc-ac55-4d29-9ece-deb38974fdbc\" (UID: \"9d355edc-ac55-4d29-9ece-deb38974fdbc\") " Feb 25 16:32:04 crc kubenswrapper[4937]: I0225 16:32:04.357767 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d355edc-ac55-4d29-9ece-deb38974fdbc-kube-api-access-qkgf5" (OuterVolumeSpecName: "kube-api-access-qkgf5") pod "9d355edc-ac55-4d29-9ece-deb38974fdbc" (UID: "9d355edc-ac55-4d29-9ece-deb38974fdbc"). InnerVolumeSpecName "kube-api-access-qkgf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:32:04 crc kubenswrapper[4937]: I0225 16:32:04.446783 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkgf5\" (UniqueName: \"kubernetes.io/projected/9d355edc-ac55-4d29-9ece-deb38974fdbc-kube-api-access-qkgf5\") on node \"crc\" DevicePath \"\"" Feb 25 16:32:04 crc kubenswrapper[4937]: I0225 16:32:04.741703 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533952-fbfsk" event={"ID":"9d355edc-ac55-4d29-9ece-deb38974fdbc","Type":"ContainerDied","Data":"5213b7308abfc989093ae97efffdc033aef3e2d24dfb5b038b74fdad97083714"} Feb 25 16:32:04 crc kubenswrapper[4937]: I0225 16:32:04.741762 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5213b7308abfc989093ae97efffdc033aef3e2d24dfb5b038b74fdad97083714" Feb 25 16:32:04 crc kubenswrapper[4937]: I0225 16:32:04.742199 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533952-fbfsk" Feb 25 16:32:05 crc kubenswrapper[4937]: I0225 16:32:05.335906 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533946-gs99v"] Feb 25 16:32:05 crc kubenswrapper[4937]: I0225 16:32:05.344366 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533946-gs99v"] Feb 25 16:32:05 crc kubenswrapper[4937]: I0225 16:32:05.381125 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17608ad1-2ff4-446f-af20-a509268ac519" path="/var/lib/kubelet/pods/17608ad1-2ff4-446f-af20-a509268ac519/volumes" Feb 25 16:32:25 crc kubenswrapper[4937]: I0225 16:32:25.953924 4937 generic.go:334] "Generic (PLEG): container finished" podID="b1499078-381f-48bd-bcfb-c9bd057fa5d2" containerID="c814729060cf49e9209127683d80d630ae10cd333acfc4a9dbc75e1ab62da0ba" exitCode=0 Feb 25 16:32:25 crc kubenswrapper[4937]: I0225 16:32:25.954056 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" event={"ID":"b1499078-381f-48bd-bcfb-c9bd057fa5d2","Type":"ContainerDied","Data":"c814729060cf49e9209127683d80d630ae10cd333acfc4a9dbc75e1ab62da0ba"} Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.518474 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.557673 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-libvirt-secret-0\") pod \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.558221 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-ssh-key-openstack-edpm-ipam\") pod \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.558933 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-inventory\") pod \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.559110 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-libvirt-combined-ca-bundle\") pod \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.559315 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4jkp\" (UniqueName: \"kubernetes.io/projected/b1499078-381f-48bd-bcfb-c9bd057fa5d2-kube-api-access-w4jkp\") pod \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\" (UID: \"b1499078-381f-48bd-bcfb-c9bd057fa5d2\") " Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.563973 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1499078-381f-48bd-bcfb-c9bd057fa5d2-kube-api-access-w4jkp" (OuterVolumeSpecName: "kube-api-access-w4jkp") pod "b1499078-381f-48bd-bcfb-c9bd057fa5d2" (UID: "b1499078-381f-48bd-bcfb-c9bd057fa5d2"). InnerVolumeSpecName "kube-api-access-w4jkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.566714 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b1499078-381f-48bd-bcfb-c9bd057fa5d2" (UID: "b1499078-381f-48bd-bcfb-c9bd057fa5d2"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.589879 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-inventory" (OuterVolumeSpecName: "inventory") pod "b1499078-381f-48bd-bcfb-c9bd057fa5d2" (UID: "b1499078-381f-48bd-bcfb-c9bd057fa5d2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.592917 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b1499078-381f-48bd-bcfb-c9bd057fa5d2" (UID: "b1499078-381f-48bd-bcfb-c9bd057fa5d2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.600353 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b1499078-381f-48bd-bcfb-c9bd057fa5d2" (UID: "b1499078-381f-48bd-bcfb-c9bd057fa5d2"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.662578 4937 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.662613 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4jkp\" (UniqueName: \"kubernetes.io/projected/b1499078-381f-48bd-bcfb-c9bd057fa5d2-kube-api-access-w4jkp\") on node \"crc\" DevicePath \"\"" Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.662622 4937 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.662630 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.662639 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1499078-381f-48bd-bcfb-c9bd057fa5d2-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.977543 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" event={"ID":"b1499078-381f-48bd-bcfb-c9bd057fa5d2","Type":"ContainerDied","Data":"f624b88cf4ae2920553a48796c0c12b6253845c7ba20b99f88a3ab2a8662ab4e"} Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.977833 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f624b88cf4ae2920553a48796c0c12b6253845c7ba20b99f88a3ab2a8662ab4e" Feb 25 16:32:27 crc kubenswrapper[4937]: I0225 16:32:27.977659 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bzxct" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.091094 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb"] Feb 25 16:32:28 crc kubenswrapper[4937]: E0225 16:32:28.091602 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1499078-381f-48bd-bcfb-c9bd057fa5d2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.091625 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1499078-381f-48bd-bcfb-c9bd057fa5d2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 25 16:32:28 crc kubenswrapper[4937]: E0225 16:32:28.091678 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d355edc-ac55-4d29-9ece-deb38974fdbc" containerName="oc" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.091687 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d355edc-ac55-4d29-9ece-deb38974fdbc" containerName="oc" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.091957 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1499078-381f-48bd-bcfb-c9bd057fa5d2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.091996 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d355edc-ac55-4d29-9ece-deb38974fdbc" containerName="oc" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.092923 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.097301 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.097593 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.097417 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.097528 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.098243 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.100198 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.097789 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.107773 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb"] Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.171806 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j9pr\" (UniqueName: \"kubernetes.io/projected/dbc3ffd6-39f1-4130-9083-033d890d558d-kube-api-access-5j9pr\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.171871 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.171915 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.171947 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.171978 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.172000 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.172168 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.172239 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.172322 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.172416 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.172448 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.275671 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.275813 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.276109 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.276812 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.276943 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.277021 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" 
(UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.277148 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.277355 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.277414 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.277860 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j9pr\" (UniqueName: \"kubernetes.io/projected/dbc3ffd6-39f1-4130-9083-033d890d558d-kube-api-access-5j9pr\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.277974 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.278411 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.281840 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.282088 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.282150 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.283503 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.285051 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.286985 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.287476 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.289139 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.296636 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.302072 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j9pr\" (UniqueName: \"kubernetes.io/projected/dbc3ffd6-39f1-4130-9083-033d890d558d-kube-api-access-5j9pr\") 
pod \"nova-edpm-deployment-openstack-edpm-ipam-9w4zb\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.416073 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.936905 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb"] Feb 25 16:32:28 crc kubenswrapper[4937]: I0225 16:32:28.989461 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" event={"ID":"dbc3ffd6-39f1-4130-9083-033d890d558d","Type":"ContainerStarted","Data":"fedeecc279a1b83872de1e6d44a2a837e1a392d709a6bcecc9838fd77bf75657"} Feb 25 16:32:30 crc kubenswrapper[4937]: I0225 16:32:30.003199 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" event={"ID":"dbc3ffd6-39f1-4130-9083-033d890d558d","Type":"ContainerStarted","Data":"57bf0f31c4a0e26aa91f89eed74461196fb58101a8d8a166affb3419e5f1ccb3"} Feb 25 16:32:30 crc kubenswrapper[4937]: I0225 16:32:30.036060 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" podStartSLOduration=1.4334438440000001 podStartE2EDuration="2.036034506s" podCreationTimestamp="2026-02-25 16:32:28 +0000 UTC" firstStartedPulling="2026-02-25 16:32:28.947719236 +0000 UTC m=+2799.961111126" lastFinishedPulling="2026-02-25 16:32:29.550309898 +0000 UTC m=+2800.563701788" observedRunningTime="2026-02-25 16:32:30.021946413 +0000 UTC m=+2801.035338333" watchObservedRunningTime="2026-02-25 16:32:30.036034506 +0000 UTC m=+2801.049426416" Feb 25 16:32:41 crc kubenswrapper[4937]: I0225 16:32:41.772668 4937 scope.go:117] "RemoveContainer" containerID="b376e5ac0917e5b867cd3dddeb69853e1432bac988429aafd30e073dadee3fd5" Feb 25 16:33:11 crc kubenswrapper[4937]: I0225 16:33:11.494895 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:33:11 crc kubenswrapper[4937]: I0225 16:33:11.495577 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:33:41 crc kubenswrapper[4937]: I0225 16:33:41.495094 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:33:41 crc kubenswrapper[4937]: I0225 16:33:41.495728 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 
16:34:00 crc kubenswrapper[4937]: I0225 16:34:00.146600 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533954-x95fl"] Feb 25 16:34:00 crc kubenswrapper[4937]: I0225 16:34:00.149624 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533954-x95fl" Feb 25 16:34:00 crc kubenswrapper[4937]: I0225 16:34:00.154832 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:34:00 crc kubenswrapper[4937]: I0225 16:34:00.154875 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:34:00 crc kubenswrapper[4937]: I0225 16:34:00.155131 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:34:00 crc kubenswrapper[4937]: I0225 16:34:00.161843 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533954-x95fl"] Feb 25 16:34:00 crc kubenswrapper[4937]: I0225 16:34:00.219661 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd9ld\" (UniqueName: \"kubernetes.io/projected/a6fae4fd-13ae-41d5-820a-90e4b0ebaae5-kube-api-access-qd9ld\") pod \"auto-csr-approver-29533954-x95fl\" (UID: \"a6fae4fd-13ae-41d5-820a-90e4b0ebaae5\") " pod="openshift-infra/auto-csr-approver-29533954-x95fl" Feb 25 16:34:00 crc kubenswrapper[4937]: I0225 16:34:00.322088 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd9ld\" (UniqueName: \"kubernetes.io/projected/a6fae4fd-13ae-41d5-820a-90e4b0ebaae5-kube-api-access-qd9ld\") pod \"auto-csr-approver-29533954-x95fl\" (UID: \"a6fae4fd-13ae-41d5-820a-90e4b0ebaae5\") " pod="openshift-infra/auto-csr-approver-29533954-x95fl" Feb 25 16:34:00 crc kubenswrapper[4937]: I0225 16:34:00.348578 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd9ld\" (UniqueName: \"kubernetes.io/projected/a6fae4fd-13ae-41d5-820a-90e4b0ebaae5-kube-api-access-qd9ld\") pod \"auto-csr-approver-29533954-x95fl\" (UID: \"a6fae4fd-13ae-41d5-820a-90e4b0ebaae5\") " pod="openshift-infra/auto-csr-approver-29533954-x95fl" Feb 25 16:34:00 crc kubenswrapper[4937]: I0225 16:34:00.472733 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533954-x95fl" Feb 25 16:34:00 crc kubenswrapper[4937]: I0225 16:34:00.980377 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533954-x95fl"] Feb 25 16:34:00 crc kubenswrapper[4937]: I0225 16:34:00.981924 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 16:34:01 crc kubenswrapper[4937]: I0225 16:34:01.071375 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533954-x95fl" event={"ID":"a6fae4fd-13ae-41d5-820a-90e4b0ebaae5","Type":"ContainerStarted","Data":"976fae278dd455502b1657b20d70cf513941c3a208fd6f5d3a874542a66d3eed"} Feb 25 16:34:03 crc kubenswrapper[4937]: I0225 16:34:03.097247 4937 generic.go:334] "Generic (PLEG): container finished" podID="a6fae4fd-13ae-41d5-820a-90e4b0ebaae5" containerID="3d6b4ab9385d09ae3e498c8979fbd74da96b60bb98e0e2b7f49047e24f5e6225" exitCode=0 Feb 25 16:34:03 crc kubenswrapper[4937]: I0225 16:34:03.097818 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533954-x95fl" event={"ID":"a6fae4fd-13ae-41d5-820a-90e4b0ebaae5","Type":"ContainerDied","Data":"3d6b4ab9385d09ae3e498c8979fbd74da96b60bb98e0e2b7f49047e24f5e6225"} Feb 25 16:34:04 crc kubenswrapper[4937]: I0225 16:34:04.593771 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533954-x95fl" Feb 25 16:34:04 crc kubenswrapper[4937]: I0225 16:34:04.715338 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd9ld\" (UniqueName: \"kubernetes.io/projected/a6fae4fd-13ae-41d5-820a-90e4b0ebaae5-kube-api-access-qd9ld\") pod \"a6fae4fd-13ae-41d5-820a-90e4b0ebaae5\" (UID: \"a6fae4fd-13ae-41d5-820a-90e4b0ebaae5\") " Feb 25 16:34:04 crc kubenswrapper[4937]: I0225 16:34:04.740730 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6fae4fd-13ae-41d5-820a-90e4b0ebaae5-kube-api-access-qd9ld" (OuterVolumeSpecName: "kube-api-access-qd9ld") pod "a6fae4fd-13ae-41d5-820a-90e4b0ebaae5" (UID: "a6fae4fd-13ae-41d5-820a-90e4b0ebaae5"). InnerVolumeSpecName "kube-api-access-qd9ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:34:04 crc kubenswrapper[4937]: I0225 16:34:04.818819 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd9ld\" (UniqueName: \"kubernetes.io/projected/a6fae4fd-13ae-41d5-820a-90e4b0ebaae5-kube-api-access-qd9ld\") on node \"crc\" DevicePath \"\"" Feb 25 16:34:05 crc kubenswrapper[4937]: I0225 16:34:05.125429 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533954-x95fl" event={"ID":"a6fae4fd-13ae-41d5-820a-90e4b0ebaae5","Type":"ContainerDied","Data":"976fae278dd455502b1657b20d70cf513941c3a208fd6f5d3a874542a66d3eed"} Feb 25 16:34:05 crc kubenswrapper[4937]: I0225 16:34:05.125515 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="976fae278dd455502b1657b20d70cf513941c3a208fd6f5d3a874542a66d3eed" Feb 25 16:34:05 crc kubenswrapper[4937]: I0225 16:34:05.125637 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533954-x95fl" Feb 25 16:34:05 crc kubenswrapper[4937]: I0225 16:34:05.672704 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533948-dd2gz"] Feb 25 16:34:05 crc kubenswrapper[4937]: I0225 16:34:05.691365 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533948-dd2gz"] Feb 25 16:34:07 crc kubenswrapper[4937]: I0225 16:34:07.384811 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598a8b71-943c-4906-842e-95a601cb6dc9" path="/var/lib/kubelet/pods/598a8b71-943c-4906-842e-95a601cb6dc9/volumes" Feb 25 16:34:11 crc kubenswrapper[4937]: I0225 16:34:11.494932 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:34:11 crc kubenswrapper[4937]: I0225 16:34:11.495511 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:34:11 crc kubenswrapper[4937]: I0225 16:34:11.495554 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 16:34:11 crc kubenswrapper[4937]: I0225 16:34:11.496903 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14bc3b494b958d91b2a8bca2f2d6a7e4ab3c7c93997b6a4481ec5bf58785334b"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 16:34:11 crc kubenswrapper[4937]: I0225 16:34:11.496961 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://14bc3b494b958d91b2a8bca2f2d6a7e4ab3c7c93997b6a4481ec5bf58785334b" gracePeriod=600 Feb 25 16:34:12 crc kubenswrapper[4937]: I0225 16:34:12.193304 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="14bc3b494b958d91b2a8bca2f2d6a7e4ab3c7c93997b6a4481ec5bf58785334b" exitCode=0 Feb 25 16:34:12 crc kubenswrapper[4937]: I0225 16:34:12.193370 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"14bc3b494b958d91b2a8bca2f2d6a7e4ab3c7c93997b6a4481ec5bf58785334b"} Feb 25 16:34:12 crc kubenswrapper[4937]: I0225 16:34:12.193854 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d"} Feb 25 16:34:12 crc kubenswrapper[4937]: I0225 16:34:12.193905 4937 scope.go:117] "RemoveContainer" 
containerID="e730a675f556cd7720cb088115a31c5a58c6fcf6d276dee910a7041cde3bf8f3" Feb 25 16:34:41 crc kubenswrapper[4937]: I0225 16:34:41.888445 4937 scope.go:117] "RemoveContainer" containerID="abaaa8f0cd3c809d47542f02231dd5a6bb0229f212b06519def845720407bb52" Feb 25 16:34:47 crc kubenswrapper[4937]: I0225 16:34:47.565936 4937 generic.go:334] "Generic (PLEG): container finished" podID="dbc3ffd6-39f1-4130-9083-033d890d558d" containerID="57bf0f31c4a0e26aa91f89eed74461196fb58101a8d8a166affb3419e5f1ccb3" exitCode=0 Feb 25 16:34:47 crc kubenswrapper[4937]: I0225 16:34:47.566023 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" event={"ID":"dbc3ffd6-39f1-4130-9083-033d890d558d","Type":"ContainerDied","Data":"57bf0f31c4a0e26aa91f89eed74461196fb58101a8d8a166affb3419e5f1ccb3"} Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.146804 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.291739 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-extra-config-0\") pod \"dbc3ffd6-39f1-4130-9083-033d890d558d\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.291806 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j9pr\" (UniqueName: \"kubernetes.io/projected/dbc3ffd6-39f1-4130-9083-033d890d558d-kube-api-access-5j9pr\") pod \"dbc3ffd6-39f1-4130-9083-033d890d558d\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.291838 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-3\") pod \"dbc3ffd6-39f1-4130-9083-033d890d558d\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.291875 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-2\") pod \"dbc3ffd6-39f1-4130-9083-033d890d558d\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.291915 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-ssh-key-openstack-edpm-ipam\") pod \"dbc3ffd6-39f1-4130-9083-033d890d558d\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.291968 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-migration-ssh-key-1\") pod \"dbc3ffd6-39f1-4130-9083-033d890d558d\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.292005 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-combined-ca-bundle\") pod \"dbc3ffd6-39f1-4130-9083-033d890d558d\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.292043 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-1\") pod \"dbc3ffd6-39f1-4130-9083-033d890d558d\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.292111 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-migration-ssh-key-0\") pod \"dbc3ffd6-39f1-4130-9083-033d890d558d\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.292145 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-0\") pod \"dbc3ffd6-39f1-4130-9083-033d890d558d\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.292194 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-inventory\") pod \"dbc3ffd6-39f1-4130-9083-033d890d558d\" (UID: \"dbc3ffd6-39f1-4130-9083-033d890d558d\") " Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.310666 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "dbc3ffd6-39f1-4130-9083-033d890d558d" (UID: "dbc3ffd6-39f1-4130-9083-033d890d558d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.337665 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc3ffd6-39f1-4130-9083-033d890d558d-kube-api-access-5j9pr" (OuterVolumeSpecName: "kube-api-access-5j9pr") pod "dbc3ffd6-39f1-4130-9083-033d890d558d" (UID: "dbc3ffd6-39f1-4130-9083-033d890d558d"). InnerVolumeSpecName "kube-api-access-5j9pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.346713 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "dbc3ffd6-39f1-4130-9083-033d890d558d" (UID: "dbc3ffd6-39f1-4130-9083-033d890d558d"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.349033 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-inventory" (OuterVolumeSpecName: "inventory") pod "dbc3ffd6-39f1-4130-9083-033d890d558d" (UID: "dbc3ffd6-39f1-4130-9083-033d890d558d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.353736 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "dbc3ffd6-39f1-4130-9083-033d890d558d" (UID: "dbc3ffd6-39f1-4130-9083-033d890d558d"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.355468 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "dbc3ffd6-39f1-4130-9083-033d890d558d" (UID: "dbc3ffd6-39f1-4130-9083-033d890d558d"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.365667 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dbc3ffd6-39f1-4130-9083-033d890d558d" (UID: "dbc3ffd6-39f1-4130-9083-033d890d558d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.365708 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "dbc3ffd6-39f1-4130-9083-033d890d558d" (UID: "dbc3ffd6-39f1-4130-9083-033d890d558d"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.369693 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "dbc3ffd6-39f1-4130-9083-033d890d558d" (UID: "dbc3ffd6-39f1-4130-9083-033d890d558d"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.370302 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "dbc3ffd6-39f1-4130-9083-033d890d558d" (UID: "dbc3ffd6-39f1-4130-9083-033d890d558d"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.387630 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "dbc3ffd6-39f1-4130-9083-033d890d558d" (UID: "dbc3ffd6-39f1-4130-9083-033d890d558d"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.394559 4937 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.394746 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j9pr\" (UniqueName: \"kubernetes.io/projected/dbc3ffd6-39f1-4130-9083-033d890d558d-kube-api-access-5j9pr\") on node \"crc\" DevicePath \"\"" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.394812 4937 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.394899 4937 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.394961 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.395017 4937 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.395104 4937 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.395158 4937 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.395216 4937 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.395280 4937 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.395337 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbc3ffd6-39f1-4130-9083-033d890d558d-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.597520 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" event={"ID":"dbc3ffd6-39f1-4130-9083-033d890d558d","Type":"ContainerDied","Data":"fedeecc279a1b83872de1e6d44a2a837e1a392d709a6bcecc9838fd77bf75657"} Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 
16:34:49.597814 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fedeecc279a1b83872de1e6d44a2a837e1a392d709a6bcecc9838fd77bf75657" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.597613 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9w4zb" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.753531 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr"] Feb 25 16:34:49 crc kubenswrapper[4937]: E0225 16:34:49.754048 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6fae4fd-13ae-41d5-820a-90e4b0ebaae5" containerName="oc" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.754070 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6fae4fd-13ae-41d5-820a-90e4b0ebaae5" containerName="oc" Feb 25 16:34:49 crc kubenswrapper[4937]: E0225 16:34:49.754098 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc3ffd6-39f1-4130-9083-033d890d558d" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.754106 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc3ffd6-39f1-4130-9083-033d890d558d" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.754341 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6fae4fd-13ae-41d5-820a-90e4b0ebaae5" containerName="oc" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.754367 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc3ffd6-39f1-4130-9083-033d890d558d" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.755354 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.758739 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.758977 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.759162 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mmwdk" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.760121 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.772106 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr"] Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.776833 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.905739 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.905788 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snq9q\" (UniqueName: \"kubernetes.io/projected/a6ef0688-25f8-4018-8976-30334bf11136-kube-api-access-snq9q\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.905819 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.905952 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.905985 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 
16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.906035 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:49 crc kubenswrapper[4937]: I0225 16:34:49.906072 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.007988 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.008323 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snq9q\" (UniqueName: \"kubernetes.io/projected/a6ef0688-25f8-4018-8976-30334bf11136-kube-api-access-snq9q\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.008534 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.008802 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.008991 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.009212 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.009401 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.011795 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.011802 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.013144 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.014677 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.015845 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.018050 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.027832 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snq9q\" (UniqueName: 
\"kubernetes.io/projected/a6ef0688-25f8-4018-8976-30334bf11136-kube-api-access-snq9q\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g87qr\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.094585 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:34:50 crc kubenswrapper[4937]: I0225 16:34:50.699426 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr"] Feb 25 16:34:51 crc kubenswrapper[4937]: I0225 16:34:51.619196 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" event={"ID":"a6ef0688-25f8-4018-8976-30334bf11136","Type":"ContainerStarted","Data":"76a17dda0b2009c6cfbffd3130900c5cb3c317881ac20303ea968d0239b4ec65"} Feb 25 16:34:51 crc kubenswrapper[4937]: I0225 16:34:51.619807 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" event={"ID":"a6ef0688-25f8-4018-8976-30334bf11136","Type":"ContainerStarted","Data":"161772e560b6805171d92ff8b88c56c4332ecb2415f1e0653e4baf7c6f93b2ee"} Feb 25 16:34:51 crc kubenswrapper[4937]: I0225 16:34:51.640584 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" podStartSLOduration=2.209762995 podStartE2EDuration="2.640566338s" podCreationTimestamp="2026-02-25 16:34:49 +0000 UTC" firstStartedPulling="2026-02-25 16:34:50.704797326 +0000 UTC m=+2941.718189216" lastFinishedPulling="2026-02-25 16:34:51.135600669 +0000 UTC m=+2942.148992559" observedRunningTime="2026-02-25 16:34:51.639727367 +0000 UTC m=+2942.653119297" watchObservedRunningTime="2026-02-25 16:34:51.640566338 +0000 UTC m=+2942.653958228" Feb 25 16:36:00 crc kubenswrapper[4937]: I0225 16:36:00.164404 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533956-wgbtn"] Feb 25 16:36:00 crc kubenswrapper[4937]: I0225 16:36:00.166733 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533956-wgbtn" Feb 25 16:36:00 crc kubenswrapper[4937]: I0225 16:36:00.168971 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:36:00 crc kubenswrapper[4937]: I0225 16:36:00.169037 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:36:00 crc kubenswrapper[4937]: I0225 16:36:00.171051 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:36:00 crc kubenswrapper[4937]: I0225 16:36:00.183007 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533956-wgbtn"] Feb 25 16:36:00 crc kubenswrapper[4937]: I0225 16:36:00.309935 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bzr9\" (UniqueName: \"kubernetes.io/projected/43c48c19-789b-4324-9f62-e7d421eb8595-kube-api-access-6bzr9\") pod \"auto-csr-approver-29533956-wgbtn\" (UID: \"43c48c19-789b-4324-9f62-e7d421eb8595\") " pod="openshift-infra/auto-csr-approver-29533956-wgbtn" Feb 25 16:36:00 crc kubenswrapper[4937]: I0225 16:36:00.412497 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bzr9\" (UniqueName: \"kubernetes.io/projected/43c48c19-789b-4324-9f62-e7d421eb8595-kube-api-access-6bzr9\") pod \"auto-csr-approver-29533956-wgbtn\" (UID: \"43c48c19-789b-4324-9f62-e7d421eb8595\") " pod="openshift-infra/auto-csr-approver-29533956-wgbtn" Feb 25 16:36:00 crc kubenswrapper[4937]: I0225 16:36:00.437398 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bzr9\" (UniqueName: \"kubernetes.io/projected/43c48c19-789b-4324-9f62-e7d421eb8595-kube-api-access-6bzr9\") pod \"auto-csr-approver-29533956-wgbtn\" (UID: \"43c48c19-789b-4324-9f62-e7d421eb8595\") " pod="openshift-infra/auto-csr-approver-29533956-wgbtn" Feb 25 16:36:00 crc kubenswrapper[4937]: I0225 16:36:00.493511 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533956-wgbtn" Feb 25 16:36:00 crc kubenswrapper[4937]: I0225 16:36:00.982508 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533956-wgbtn"] Feb 25 16:36:01 crc kubenswrapper[4937]: I0225 16:36:01.309098 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533956-wgbtn" event={"ID":"43c48c19-789b-4324-9f62-e7d421eb8595","Type":"ContainerStarted","Data":"3c308e5a9b0e0e8f08eb0f67e8a33a9852d8dcaae4142734cdccd3f4e6a46c09"} Feb 25 16:36:02 crc kubenswrapper[4937]: I0225 16:36:02.321771 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533956-wgbtn" event={"ID":"43c48c19-789b-4324-9f62-e7d421eb8595","Type":"ContainerStarted","Data":"a28f6ebff7ba476bb4bd8474e60db715854742225d3aadaba9533a9f9ad4835e"} Feb 25 16:36:02 crc kubenswrapper[4937]: I0225 16:36:02.342619 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533956-wgbtn" podStartSLOduration=1.447193714 podStartE2EDuration="2.342591694s" podCreationTimestamp="2026-02-25 16:36:00 +0000 UTC" firstStartedPulling="2026-02-25 16:36:00.982695528 +0000 UTC m=+3011.996087418" lastFinishedPulling="2026-02-25 16:36:01.878093508 +0000 UTC m=+3012.891485398" observedRunningTime="2026-02-25 16:36:02.334815309 +0000 UTC m=+3013.348207199" watchObservedRunningTime="2026-02-25 16:36:02.342591694 +0000 UTC m=+3013.355983604" Feb 25 16:36:03 crc kubenswrapper[4937]: I0225 16:36:03.335577 4937 generic.go:334] "Generic (PLEG): container finished" podID="43c48c19-789b-4324-9f62-e7d421eb8595" containerID="a28f6ebff7ba476bb4bd8474e60db715854742225d3aadaba9533a9f9ad4835e" exitCode=0 Feb 25 16:36:03 crc kubenswrapper[4937]: I0225 16:36:03.335700 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533956-wgbtn" event={"ID":"43c48c19-789b-4324-9f62-e7d421eb8595","Type":"ContainerDied","Data":"a28f6ebff7ba476bb4bd8474e60db715854742225d3aadaba9533a9f9ad4835e"} Feb 25 16:36:04 crc kubenswrapper[4937]: I0225 16:36:04.806868 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533956-wgbtn" Feb 25 16:36:04 crc kubenswrapper[4937]: I0225 16:36:04.906710 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bzr9\" (UniqueName: \"kubernetes.io/projected/43c48c19-789b-4324-9f62-e7d421eb8595-kube-api-access-6bzr9\") pod \"43c48c19-789b-4324-9f62-e7d421eb8595\" (UID: \"43c48c19-789b-4324-9f62-e7d421eb8595\") " Feb 25 16:36:04 crc kubenswrapper[4937]: I0225 16:36:04.920776 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43c48c19-789b-4324-9f62-e7d421eb8595-kube-api-access-6bzr9" (OuterVolumeSpecName: "kube-api-access-6bzr9") pod "43c48c19-789b-4324-9f62-e7d421eb8595" (UID: "43c48c19-789b-4324-9f62-e7d421eb8595"). InnerVolumeSpecName "kube-api-access-6bzr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:36:05 crc kubenswrapper[4937]: I0225 16:36:05.009433 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bzr9\" (UniqueName: \"kubernetes.io/projected/43c48c19-789b-4324-9f62-e7d421eb8595-kube-api-access-6bzr9\") on node \"crc\" DevicePath \"\"" Feb 25 16:36:05 crc kubenswrapper[4937]: I0225 16:36:05.376441 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533956-wgbtn" Feb 25 16:36:05 crc kubenswrapper[4937]: I0225 16:36:05.396584 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533956-wgbtn" event={"ID":"43c48c19-789b-4324-9f62-e7d421eb8595","Type":"ContainerDied","Data":"3c308e5a9b0e0e8f08eb0f67e8a33a9852d8dcaae4142734cdccd3f4e6a46c09"} Feb 25 16:36:05 crc kubenswrapper[4937]: I0225 16:36:05.396623 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c308e5a9b0e0e8f08eb0f67e8a33a9852d8dcaae4142734cdccd3f4e6a46c09" Feb 25 16:36:05 crc kubenswrapper[4937]: I0225 16:36:05.878556 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533950-dzf9p"] Feb 25 16:36:05 crc kubenswrapper[4937]: I0225 16:36:05.889311 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533950-dzf9p"] Feb 25 16:36:07 crc kubenswrapper[4937]: I0225 16:36:07.383800 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0662160-887f-42d1-873b-605e522adf02" path="/var/lib/kubelet/pods/f0662160-887f-42d1-873b-605e522adf02/volumes" Feb 25 16:36:11 crc kubenswrapper[4937]: I0225 16:36:11.495290 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:36:11 crc kubenswrapper[4937]: I0225 16:36:11.495833 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:36:41 crc kubenswrapper[4937]: I0225 16:36:41.495320 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:36:41 crc kubenswrapper[4937]: I0225 16:36:41.496242 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:36:41 crc kubenswrapper[4937]: I0225 16:36:41.991259 4937 scope.go:117] "RemoveContainer" containerID="d762949242ad0fcf565988d9777d577fe1149ce5167cecc9a73948a2a8b30f20" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.562449 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4z7sl"] Feb 25 16:36:52 crc kubenswrapper[4937]: E0225 16:36:52.563273 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43c48c19-789b-4324-9f62-e7d421eb8595" containerName="oc" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.563288 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="43c48c19-789b-4324-9f62-e7d421eb8595" containerName="oc" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.563501 4937 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="43c48c19-789b-4324-9f62-e7d421eb8595" containerName="oc" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.564884 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.586641 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4z7sl"] Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.674725 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d6jp\" (UniqueName: \"kubernetes.io/projected/933196b1-39bc-44f3-a2f0-c70e79efa389-kube-api-access-7d6jp\") pod \"community-operators-4z7sl\" (UID: \"933196b1-39bc-44f3-a2f0-c70e79efa389\") " pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.674830 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/933196b1-39bc-44f3-a2f0-c70e79efa389-utilities\") pod \"community-operators-4z7sl\" (UID: \"933196b1-39bc-44f3-a2f0-c70e79efa389\") " pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.675061 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/933196b1-39bc-44f3-a2f0-c70e79efa389-catalog-content\") pod \"community-operators-4z7sl\" (UID: \"933196b1-39bc-44f3-a2f0-c70e79efa389\") " pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.777570 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d6jp\" (UniqueName: \"kubernetes.io/projected/933196b1-39bc-44f3-a2f0-c70e79efa389-kube-api-access-7d6jp\") pod \"community-operators-4z7sl\" (UID: \"933196b1-39bc-44f3-a2f0-c70e79efa389\") " pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.777683 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/933196b1-39bc-44f3-a2f0-c70e79efa389-utilities\") pod \"community-operators-4z7sl\" (UID: \"933196b1-39bc-44f3-a2f0-c70e79efa389\") " pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.777794 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/933196b1-39bc-44f3-a2f0-c70e79efa389-catalog-content\") pod \"community-operators-4z7sl\" (UID: \"933196b1-39bc-44f3-a2f0-c70e79efa389\") " pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.778332 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/933196b1-39bc-44f3-a2f0-c70e79efa389-catalog-content\") pod \"community-operators-4z7sl\" (UID: \"933196b1-39bc-44f3-a2f0-c70e79efa389\") " pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.778897 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/933196b1-39bc-44f3-a2f0-c70e79efa389-utilities\") pod \"community-operators-4z7sl\" (UID: 
\"933196b1-39bc-44f3-a2f0-c70e79efa389\") " pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.806962 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d6jp\" (UniqueName: \"kubernetes.io/projected/933196b1-39bc-44f3-a2f0-c70e79efa389-kube-api-access-7d6jp\") pod \"community-operators-4z7sl\" (UID: \"933196b1-39bc-44f3-a2f0-c70e79efa389\") " pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:36:52 crc kubenswrapper[4937]: I0225 16:36:52.884219 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:36:53 crc kubenswrapper[4937]: I0225 16:36:53.346160 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4z7sl"] Feb 25 16:36:53 crc kubenswrapper[4937]: I0225 16:36:53.874722 4937 generic.go:334] "Generic (PLEG): container finished" podID="933196b1-39bc-44f3-a2f0-c70e79efa389" containerID="2ad33f59606ea845e275369599798367e360c13ba0f4f34640f326c9e31b1b59" exitCode=0 Feb 25 16:36:53 crc kubenswrapper[4937]: I0225 16:36:53.874773 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4z7sl" event={"ID":"933196b1-39bc-44f3-a2f0-c70e79efa389","Type":"ContainerDied","Data":"2ad33f59606ea845e275369599798367e360c13ba0f4f34640f326c9e31b1b59"} Feb 25 16:36:53 crc kubenswrapper[4937]: I0225 16:36:53.874995 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4z7sl" event={"ID":"933196b1-39bc-44f3-a2f0-c70e79efa389","Type":"ContainerStarted","Data":"3adacd409e1ee18a9a26874cfee144b088707743d0ef0d9bac1b418a9c7a8b4f"} Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.557857 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c4l5b"] Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.561473 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.576183 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c4l5b"] Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.745177 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53669da0-a436-4537-b09e-7dad7fc0686a-utilities\") pod \"certified-operators-c4l5b\" (UID: \"53669da0-a436-4537-b09e-7dad7fc0686a\") " pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.745356 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53669da0-a436-4537-b09e-7dad7fc0686a-catalog-content\") pod \"certified-operators-c4l5b\" (UID: \"53669da0-a436-4537-b09e-7dad7fc0686a\") " pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.745519 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjpkc\" (UniqueName: \"kubernetes.io/projected/53669da0-a436-4537-b09e-7dad7fc0686a-kube-api-access-hjpkc\") pod \"certified-operators-c4l5b\" (UID: \"53669da0-a436-4537-b09e-7dad7fc0686a\") " pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.847443 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53669da0-a436-4537-b09e-7dad7fc0686a-catalog-content\") pod \"certified-operators-c4l5b\" (UID: \"53669da0-a436-4537-b09e-7dad7fc0686a\") " pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.847554 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjpkc\" (UniqueName: \"kubernetes.io/projected/53669da0-a436-4537-b09e-7dad7fc0686a-kube-api-access-hjpkc\") pod \"certified-operators-c4l5b\" (UID: \"53669da0-a436-4537-b09e-7dad7fc0686a\") " pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.847714 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53669da0-a436-4537-b09e-7dad7fc0686a-utilities\") pod \"certified-operators-c4l5b\" (UID: \"53669da0-a436-4537-b09e-7dad7fc0686a\") " pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.848300 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53669da0-a436-4537-b09e-7dad7fc0686a-catalog-content\") pod \"certified-operators-c4l5b\" (UID: \"53669da0-a436-4537-b09e-7dad7fc0686a\") " pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.848773 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53669da0-a436-4537-b09e-7dad7fc0686a-utilities\") pod \"certified-operators-c4l5b\" (UID: \"53669da0-a436-4537-b09e-7dad7fc0686a\") " pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.870176 4937 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hjpkc\" (UniqueName: \"kubernetes.io/projected/53669da0-a436-4537-b09e-7dad7fc0686a-kube-api-access-hjpkc\") pod \"certified-operators-c4l5b\" (UID: \"53669da0-a436-4537-b09e-7dad7fc0686a\") " pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.886222 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:36:55 crc kubenswrapper[4937]: I0225 16:36:55.895968 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4z7sl" event={"ID":"933196b1-39bc-44f3-a2f0-c70e79efa389","Type":"ContainerStarted","Data":"8d3436f4cbc2472c66b3430e8780ccc36ca3fd1b38130243edbf98de2047110a"} Feb 25 16:36:56 crc kubenswrapper[4937]: I0225 16:36:56.399103 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c4l5b"] Feb 25 16:36:56 crc kubenswrapper[4937]: W0225 16:36:56.421145 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53669da0_a436_4537_b09e_7dad7fc0686a.slice/crio-9c098c6e726d4e170154787ac257b94d7afd30a6cdab4a4d3abb6683fe96b716 WatchSource:0}: Error finding container 9c098c6e726d4e170154787ac257b94d7afd30a6cdab4a4d3abb6683fe96b716: Status 404 returned error can't find the container with id 9c098c6e726d4e170154787ac257b94d7afd30a6cdab4a4d3abb6683fe96b716 Feb 25 16:36:56 crc kubenswrapper[4937]: I0225 16:36:56.908766 4937 generic.go:334] "Generic (PLEG): container finished" podID="53669da0-a436-4537-b09e-7dad7fc0686a" containerID="459fcd662b37023bb2e6bb1a44933fe23da955b013ad9aeb2c4a4177ede0fbe1" exitCode=0 Feb 25 16:36:56 crc kubenswrapper[4937]: I0225 16:36:56.908809 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4l5b" event={"ID":"53669da0-a436-4537-b09e-7dad7fc0686a","Type":"ContainerDied","Data":"459fcd662b37023bb2e6bb1a44933fe23da955b013ad9aeb2c4a4177ede0fbe1"} Feb 25 16:36:56 crc kubenswrapper[4937]: I0225 16:36:56.908847 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4l5b" event={"ID":"53669da0-a436-4537-b09e-7dad7fc0686a","Type":"ContainerStarted","Data":"9c098c6e726d4e170154787ac257b94d7afd30a6cdab4a4d3abb6683fe96b716"} Feb 25 16:36:56 crc kubenswrapper[4937]: I0225 16:36:56.910605 4937 generic.go:334] "Generic (PLEG): container finished" podID="933196b1-39bc-44f3-a2f0-c70e79efa389" containerID="8d3436f4cbc2472c66b3430e8780ccc36ca3fd1b38130243edbf98de2047110a" exitCode=0 Feb 25 16:36:56 crc kubenswrapper[4937]: I0225 16:36:56.910652 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4z7sl" event={"ID":"933196b1-39bc-44f3-a2f0-c70e79efa389","Type":"ContainerDied","Data":"8d3436f4cbc2472c66b3430e8780ccc36ca3fd1b38130243edbf98de2047110a"} Feb 25 16:36:57 crc kubenswrapper[4937]: I0225 16:36:57.921157 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4l5b" event={"ID":"53669da0-a436-4537-b09e-7dad7fc0686a","Type":"ContainerStarted","Data":"5a7b1e825b9e3ebaaf1a14dc86c3582c81dbfc3ca05a7c105e1a5e6653fd2d0f"} Feb 25 16:36:57 crc kubenswrapper[4937]: I0225 16:36:57.925584 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4z7sl" 
event={"ID":"933196b1-39bc-44f3-a2f0-c70e79efa389","Type":"ContainerStarted","Data":"05a30be753ea32d23d9aa4844be26a63b9e90d656353b85f05681da2811c1a29"} Feb 25 16:36:57 crc kubenswrapper[4937]: I0225 16:36:57.988547 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4z7sl" podStartSLOduration=2.492817808 podStartE2EDuration="5.988522502s" podCreationTimestamp="2026-02-25 16:36:52 +0000 UTC" firstStartedPulling="2026-02-25 16:36:53.877398214 +0000 UTC m=+3064.890790104" lastFinishedPulling="2026-02-25 16:36:57.373102918 +0000 UTC m=+3068.386494798" observedRunningTime="2026-02-25 16:36:57.982654705 +0000 UTC m=+3068.996046595" watchObservedRunningTime="2026-02-25 16:36:57.988522502 +0000 UTC m=+3069.001914392" Feb 25 16:36:59 crc kubenswrapper[4937]: I0225 16:36:59.948432 4937 generic.go:334] "Generic (PLEG): container finished" podID="53669da0-a436-4537-b09e-7dad7fc0686a" containerID="5a7b1e825b9e3ebaaf1a14dc86c3582c81dbfc3ca05a7c105e1a5e6653fd2d0f" exitCode=0 Feb 25 16:36:59 crc kubenswrapper[4937]: I0225 16:36:59.948512 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4l5b" event={"ID":"53669da0-a436-4537-b09e-7dad7fc0686a","Type":"ContainerDied","Data":"5a7b1e825b9e3ebaaf1a14dc86c3582c81dbfc3ca05a7c105e1a5e6653fd2d0f"} Feb 25 16:37:00 crc kubenswrapper[4937]: I0225 16:37:00.959986 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4l5b" event={"ID":"53669da0-a436-4537-b09e-7dad7fc0686a","Type":"ContainerStarted","Data":"5174160a005965f41286b685fa74e222730833bffd8d60f9f26806e84cd2375c"} Feb 25 16:37:01 crc kubenswrapper[4937]: I0225 16:37:01.003409 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c4l5b" podStartSLOduration=2.55376512 podStartE2EDuration="6.003384561s" podCreationTimestamp="2026-02-25 16:36:55 +0000 UTC" firstStartedPulling="2026-02-25 16:36:56.912359726 +0000 UTC m=+3067.925751616" lastFinishedPulling="2026-02-25 16:37:00.361979137 +0000 UTC m=+3071.375371057" observedRunningTime="2026-02-25 16:37:00.984375425 +0000 UTC m=+3071.997767315" watchObservedRunningTime="2026-02-25 16:37:01.003384561 +0000 UTC m=+3072.016776491" Feb 25 16:37:02 crc kubenswrapper[4937]: I0225 16:37:02.885636 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:37:02 crc kubenswrapper[4937]: I0225 16:37:02.887263 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:37:02 crc kubenswrapper[4937]: I0225 16:37:02.959220 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:37:03 crc kubenswrapper[4937]: I0225 16:37:03.042335 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:37:04 crc kubenswrapper[4937]: I0225 16:37:04.148167 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4z7sl"] Feb 25 16:37:05 crc kubenswrapper[4937]: I0225 16:37:05.886957 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:37:05 crc kubenswrapper[4937]: I0225 16:37:05.887017 4937 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:37:05 crc kubenswrapper[4937]: I0225 16:37:05.940194 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:37:06 crc kubenswrapper[4937]: I0225 16:37:06.012588 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4z7sl" podUID="933196b1-39bc-44f3-a2f0-c70e79efa389" containerName="registry-server" containerID="cri-o://05a30be753ea32d23d9aa4844be26a63b9e90d656353b85f05681da2811c1a29" gracePeriod=2 Feb 25 16:37:06 crc kubenswrapper[4937]: I0225 16:37:06.060800 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:37:06 crc kubenswrapper[4937]: I0225 16:37:06.568508 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:37:06 crc kubenswrapper[4937]: I0225 16:37:06.620308 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/933196b1-39bc-44f3-a2f0-c70e79efa389-catalog-content\") pod \"933196b1-39bc-44f3-a2f0-c70e79efa389\" (UID: \"933196b1-39bc-44f3-a2f0-c70e79efa389\") " Feb 25 16:37:06 crc kubenswrapper[4937]: I0225 16:37:06.620563 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d6jp\" (UniqueName: \"kubernetes.io/projected/933196b1-39bc-44f3-a2f0-c70e79efa389-kube-api-access-7d6jp\") pod \"933196b1-39bc-44f3-a2f0-c70e79efa389\" (UID: \"933196b1-39bc-44f3-a2f0-c70e79efa389\") " Feb 25 16:37:06 crc kubenswrapper[4937]: I0225 16:37:06.632852 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/933196b1-39bc-44f3-a2f0-c70e79efa389-kube-api-access-7d6jp" (OuterVolumeSpecName: "kube-api-access-7d6jp") pod "933196b1-39bc-44f3-a2f0-c70e79efa389" (UID: "933196b1-39bc-44f3-a2f0-c70e79efa389"). InnerVolumeSpecName "kube-api-access-7d6jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:37:06 crc kubenswrapper[4937]: I0225 16:37:06.690886 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/933196b1-39bc-44f3-a2f0-c70e79efa389-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "933196b1-39bc-44f3-a2f0-c70e79efa389" (UID: "933196b1-39bc-44f3-a2f0-c70e79efa389"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:37:06 crc kubenswrapper[4937]: I0225 16:37:06.724042 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/933196b1-39bc-44f3-a2f0-c70e79efa389-utilities\") pod \"933196b1-39bc-44f3-a2f0-c70e79efa389\" (UID: \"933196b1-39bc-44f3-a2f0-c70e79efa389\") " Feb 25 16:37:06 crc kubenswrapper[4937]: I0225 16:37:06.724836 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/933196b1-39bc-44f3-a2f0-c70e79efa389-utilities" (OuterVolumeSpecName: "utilities") pod "933196b1-39bc-44f3-a2f0-c70e79efa389" (UID: "933196b1-39bc-44f3-a2f0-c70e79efa389"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:37:06 crc kubenswrapper[4937]: I0225 16:37:06.724955 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d6jp\" (UniqueName: \"kubernetes.io/projected/933196b1-39bc-44f3-a2f0-c70e79efa389-kube-api-access-7d6jp\") on node \"crc\" DevicePath \"\"" Feb 25 16:37:06 crc kubenswrapper[4937]: I0225 16:37:06.724977 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/933196b1-39bc-44f3-a2f0-c70e79efa389-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:37:06 crc kubenswrapper[4937]: I0225 16:37:06.724993 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/933196b1-39bc-44f3-a2f0-c70e79efa389-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.021903 4937 generic.go:334] "Generic (PLEG): container finished" podID="933196b1-39bc-44f3-a2f0-c70e79efa389" containerID="05a30be753ea32d23d9aa4844be26a63b9e90d656353b85f05681da2811c1a29" exitCode=0 Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.021955 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4z7sl" Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.021947 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4z7sl" event={"ID":"933196b1-39bc-44f3-a2f0-c70e79efa389","Type":"ContainerDied","Data":"05a30be753ea32d23d9aa4844be26a63b9e90d656353b85f05681da2811c1a29"} Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.022003 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4z7sl" event={"ID":"933196b1-39bc-44f3-a2f0-c70e79efa389","Type":"ContainerDied","Data":"3adacd409e1ee18a9a26874cfee144b088707743d0ef0d9bac1b418a9c7a8b4f"} Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.022021 4937 scope.go:117] "RemoveContainer" containerID="05a30be753ea32d23d9aa4844be26a63b9e90d656353b85f05681da2811c1a29" Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.040829 4937 scope.go:117] "RemoveContainer" containerID="8d3436f4cbc2472c66b3430e8780ccc36ca3fd1b38130243edbf98de2047110a" Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.060043 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4z7sl"] Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.068119 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4z7sl"] Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.076356 4937 scope.go:117] "RemoveContainer" containerID="2ad33f59606ea845e275369599798367e360c13ba0f4f34640f326c9e31b1b59" Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.124170 4937 scope.go:117] "RemoveContainer" containerID="05a30be753ea32d23d9aa4844be26a63b9e90d656353b85f05681da2811c1a29" Feb 25 16:37:07 crc kubenswrapper[4937]: E0225 16:37:07.124762 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05a30be753ea32d23d9aa4844be26a63b9e90d656353b85f05681da2811c1a29\": container with ID starting with 05a30be753ea32d23d9aa4844be26a63b9e90d656353b85f05681da2811c1a29 not found: ID does not exist" containerID="05a30be753ea32d23d9aa4844be26a63b9e90d656353b85f05681da2811c1a29" Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.124809 
4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a30be753ea32d23d9aa4844be26a63b9e90d656353b85f05681da2811c1a29"} err="failed to get container status \"05a30be753ea32d23d9aa4844be26a63b9e90d656353b85f05681da2811c1a29\": rpc error: code = NotFound desc = could not find container \"05a30be753ea32d23d9aa4844be26a63b9e90d656353b85f05681da2811c1a29\": container with ID starting with 05a30be753ea32d23d9aa4844be26a63b9e90d656353b85f05681da2811c1a29 not found: ID does not exist" Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.124842 4937 scope.go:117] "RemoveContainer" containerID="8d3436f4cbc2472c66b3430e8780ccc36ca3fd1b38130243edbf98de2047110a" Feb 25 16:37:07 crc kubenswrapper[4937]: E0225 16:37:07.125272 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d3436f4cbc2472c66b3430e8780ccc36ca3fd1b38130243edbf98de2047110a\": container with ID starting with 8d3436f4cbc2472c66b3430e8780ccc36ca3fd1b38130243edbf98de2047110a not found: ID does not exist" containerID="8d3436f4cbc2472c66b3430e8780ccc36ca3fd1b38130243edbf98de2047110a" Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.125302 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d3436f4cbc2472c66b3430e8780ccc36ca3fd1b38130243edbf98de2047110a"} err="failed to get container status \"8d3436f4cbc2472c66b3430e8780ccc36ca3fd1b38130243edbf98de2047110a\": rpc error: code = NotFound desc = could not find container \"8d3436f4cbc2472c66b3430e8780ccc36ca3fd1b38130243edbf98de2047110a\": container with ID starting with 8d3436f4cbc2472c66b3430e8780ccc36ca3fd1b38130243edbf98de2047110a not found: ID does not exist" Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.125321 4937 scope.go:117] "RemoveContainer" containerID="2ad33f59606ea845e275369599798367e360c13ba0f4f34640f326c9e31b1b59" Feb 25 16:37:07 crc kubenswrapper[4937]: E0225 16:37:07.125656 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ad33f59606ea845e275369599798367e360c13ba0f4f34640f326c9e31b1b59\": container with ID starting with 2ad33f59606ea845e275369599798367e360c13ba0f4f34640f326c9e31b1b59 not found: ID does not exist" containerID="2ad33f59606ea845e275369599798367e360c13ba0f4f34640f326c9e31b1b59" Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.125713 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad33f59606ea845e275369599798367e360c13ba0f4f34640f326c9e31b1b59"} err="failed to get container status \"2ad33f59606ea845e275369599798367e360c13ba0f4f34640f326c9e31b1b59\": rpc error: code = NotFound desc = could not find container \"2ad33f59606ea845e275369599798367e360c13ba0f4f34640f326c9e31b1b59\": container with ID starting with 2ad33f59606ea845e275369599798367e360c13ba0f4f34640f326c9e31b1b59 not found: ID does not exist" Feb 25 16:37:07 crc kubenswrapper[4937]: I0225 16:37:07.385284 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="933196b1-39bc-44f3-a2f0-c70e79efa389" path="/var/lib/kubelet/pods/933196b1-39bc-44f3-a2f0-c70e79efa389/volumes" Feb 25 16:37:08 crc kubenswrapper[4937]: I0225 16:37:08.340692 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c4l5b"] Feb 25 16:37:08 crc kubenswrapper[4937]: I0225 16:37:08.341282 4937 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/certified-operators-c4l5b" podUID="53669da0-a436-4537-b09e-7dad7fc0686a" containerName="registry-server" containerID="cri-o://5174160a005965f41286b685fa74e222730833bffd8d60f9f26806e84cd2375c" gracePeriod=2 Feb 25 16:37:08 crc kubenswrapper[4937]: I0225 16:37:08.847539 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:37:08 crc kubenswrapper[4937]: I0225 16:37:08.969642 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53669da0-a436-4537-b09e-7dad7fc0686a-catalog-content\") pod \"53669da0-a436-4537-b09e-7dad7fc0686a\" (UID: \"53669da0-a436-4537-b09e-7dad7fc0686a\") " Feb 25 16:37:08 crc kubenswrapper[4937]: I0225 16:37:08.969978 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53669da0-a436-4537-b09e-7dad7fc0686a-utilities\") pod \"53669da0-a436-4537-b09e-7dad7fc0686a\" (UID: \"53669da0-a436-4537-b09e-7dad7fc0686a\") " Feb 25 16:37:08 crc kubenswrapper[4937]: I0225 16:37:08.970043 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjpkc\" (UniqueName: \"kubernetes.io/projected/53669da0-a436-4537-b09e-7dad7fc0686a-kube-api-access-hjpkc\") pod \"53669da0-a436-4537-b09e-7dad7fc0686a\" (UID: \"53669da0-a436-4537-b09e-7dad7fc0686a\") " Feb 25 16:37:08 crc kubenswrapper[4937]: I0225 16:37:08.970963 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53669da0-a436-4537-b09e-7dad7fc0686a-utilities" (OuterVolumeSpecName: "utilities") pod "53669da0-a436-4537-b09e-7dad7fc0686a" (UID: "53669da0-a436-4537-b09e-7dad7fc0686a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:37:08 crc kubenswrapper[4937]: I0225 16:37:08.976432 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53669da0-a436-4537-b09e-7dad7fc0686a-kube-api-access-hjpkc" (OuterVolumeSpecName: "kube-api-access-hjpkc") pod "53669da0-a436-4537-b09e-7dad7fc0686a" (UID: "53669da0-a436-4537-b09e-7dad7fc0686a"). InnerVolumeSpecName "kube-api-access-hjpkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.022289 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53669da0-a436-4537-b09e-7dad7fc0686a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53669da0-a436-4537-b09e-7dad7fc0686a" (UID: "53669da0-a436-4537-b09e-7dad7fc0686a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.048549 4937 generic.go:334] "Generic (PLEG): container finished" podID="53669da0-a436-4537-b09e-7dad7fc0686a" containerID="5174160a005965f41286b685fa74e222730833bffd8d60f9f26806e84cd2375c" exitCode=0 Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.048596 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4l5b" event={"ID":"53669da0-a436-4537-b09e-7dad7fc0686a","Type":"ContainerDied","Data":"5174160a005965f41286b685fa74e222730833bffd8d60f9f26806e84cd2375c"} Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.048627 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4l5b" event={"ID":"53669da0-a436-4537-b09e-7dad7fc0686a","Type":"ContainerDied","Data":"9c098c6e726d4e170154787ac257b94d7afd30a6cdab4a4d3abb6683fe96b716"} Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.048647 4937 scope.go:117] "RemoveContainer" containerID="5174160a005965f41286b685fa74e222730833bffd8d60f9f26806e84cd2375c" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.048682 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c4l5b" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.073469 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53669da0-a436-4537-b09e-7dad7fc0686a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.073528 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53669da0-a436-4537-b09e-7dad7fc0686a-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.073544 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjpkc\" (UniqueName: \"kubernetes.io/projected/53669da0-a436-4537-b09e-7dad7fc0686a-kube-api-access-hjpkc\") on node \"crc\" DevicePath \"\"" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.092256 4937 scope.go:117] "RemoveContainer" containerID="5a7b1e825b9e3ebaaf1a14dc86c3582c81dbfc3ca05a7c105e1a5e6653fd2d0f" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.108148 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c4l5b"] Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.134785 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c4l5b"] Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.151836 4937 scope.go:117] "RemoveContainer" containerID="459fcd662b37023bb2e6bb1a44933fe23da955b013ad9aeb2c4a4177ede0fbe1" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.175259 4937 scope.go:117] "RemoveContainer" containerID="5174160a005965f41286b685fa74e222730833bffd8d60f9f26806e84cd2375c" Feb 25 16:37:09 crc kubenswrapper[4937]: E0225 16:37:09.175746 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5174160a005965f41286b685fa74e222730833bffd8d60f9f26806e84cd2375c\": container with ID starting with 5174160a005965f41286b685fa74e222730833bffd8d60f9f26806e84cd2375c not found: ID does not exist" containerID="5174160a005965f41286b685fa74e222730833bffd8d60f9f26806e84cd2375c" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.175782 
4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5174160a005965f41286b685fa74e222730833bffd8d60f9f26806e84cd2375c"} err="failed to get container status \"5174160a005965f41286b685fa74e222730833bffd8d60f9f26806e84cd2375c\": rpc error: code = NotFound desc = could not find container \"5174160a005965f41286b685fa74e222730833bffd8d60f9f26806e84cd2375c\": container with ID starting with 5174160a005965f41286b685fa74e222730833bffd8d60f9f26806e84cd2375c not found: ID does not exist" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.175808 4937 scope.go:117] "RemoveContainer" containerID="5a7b1e825b9e3ebaaf1a14dc86c3582c81dbfc3ca05a7c105e1a5e6653fd2d0f" Feb 25 16:37:09 crc kubenswrapper[4937]: E0225 16:37:09.176228 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a7b1e825b9e3ebaaf1a14dc86c3582c81dbfc3ca05a7c105e1a5e6653fd2d0f\": container with ID starting with 5a7b1e825b9e3ebaaf1a14dc86c3582c81dbfc3ca05a7c105e1a5e6653fd2d0f not found: ID does not exist" containerID="5a7b1e825b9e3ebaaf1a14dc86c3582c81dbfc3ca05a7c105e1a5e6653fd2d0f" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.176265 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a7b1e825b9e3ebaaf1a14dc86c3582c81dbfc3ca05a7c105e1a5e6653fd2d0f"} err="failed to get container status \"5a7b1e825b9e3ebaaf1a14dc86c3582c81dbfc3ca05a7c105e1a5e6653fd2d0f\": rpc error: code = NotFound desc = could not find container \"5a7b1e825b9e3ebaaf1a14dc86c3582c81dbfc3ca05a7c105e1a5e6653fd2d0f\": container with ID starting with 5a7b1e825b9e3ebaaf1a14dc86c3582c81dbfc3ca05a7c105e1a5e6653fd2d0f not found: ID does not exist" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.176285 4937 scope.go:117] "RemoveContainer" containerID="459fcd662b37023bb2e6bb1a44933fe23da955b013ad9aeb2c4a4177ede0fbe1" Feb 25 16:37:09 crc kubenswrapper[4937]: E0225 16:37:09.176540 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"459fcd662b37023bb2e6bb1a44933fe23da955b013ad9aeb2c4a4177ede0fbe1\": container with ID starting with 459fcd662b37023bb2e6bb1a44933fe23da955b013ad9aeb2c4a4177ede0fbe1 not found: ID does not exist" containerID="459fcd662b37023bb2e6bb1a44933fe23da955b013ad9aeb2c4a4177ede0fbe1" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.176569 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"459fcd662b37023bb2e6bb1a44933fe23da955b013ad9aeb2c4a4177ede0fbe1"} err="failed to get container status \"459fcd662b37023bb2e6bb1a44933fe23da955b013ad9aeb2c4a4177ede0fbe1\": rpc error: code = NotFound desc = could not find container \"459fcd662b37023bb2e6bb1a44933fe23da955b013ad9aeb2c4a4177ede0fbe1\": container with ID starting with 459fcd662b37023bb2e6bb1a44933fe23da955b013ad9aeb2c4a4177ede0fbe1 not found: ID does not exist" Feb 25 16:37:09 crc kubenswrapper[4937]: I0225 16:37:09.389349 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53669da0-a436-4537-b09e-7dad7fc0686a" path="/var/lib/kubelet/pods/53669da0-a436-4537-b09e-7dad7fc0686a/volumes" Feb 25 16:37:11 crc kubenswrapper[4937]: I0225 16:37:11.495312 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:37:11 crc kubenswrapper[4937]: I0225 16:37:11.495776 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:37:11 crc kubenswrapper[4937]: I0225 16:37:11.495859 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 16:37:11 crc kubenswrapper[4937]: I0225 16:37:11.497430 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 16:37:11 crc kubenswrapper[4937]: I0225 16:37:11.497578 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" gracePeriod=600 Feb 25 16:37:11 crc kubenswrapper[4937]: E0225 16:37:11.635831 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:37:12 crc kubenswrapper[4937]: I0225 16:37:12.093808 4937 generic.go:334] "Generic (PLEG): container finished" podID="a6ef0688-25f8-4018-8976-30334bf11136" containerID="76a17dda0b2009c6cfbffd3130900c5cb3c317881ac20303ea968d0239b4ec65" exitCode=0 Feb 25 16:37:12 crc kubenswrapper[4937]: I0225 16:37:12.093835 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" event={"ID":"a6ef0688-25f8-4018-8976-30334bf11136","Type":"ContainerDied","Data":"76a17dda0b2009c6cfbffd3130900c5cb3c317881ac20303ea968d0239b4ec65"} Feb 25 16:37:12 crc kubenswrapper[4937]: I0225 16:37:12.098926 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" exitCode=0 Feb 25 16:37:12 crc kubenswrapper[4937]: I0225 16:37:12.099005 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d"} Feb 25 16:37:12 crc kubenswrapper[4937]: I0225 16:37:12.099431 4937 scope.go:117] "RemoveContainer" containerID="14bc3b494b958d91b2a8bca2f2d6a7e4ab3c7c93997b6a4481ec5bf58785334b" Feb 25 16:37:12 crc kubenswrapper[4937]: I0225 16:37:12.100412 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:37:12 crc 
kubenswrapper[4937]: E0225 16:37:12.101037 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.680802 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.788319 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-inventory\") pod \"a6ef0688-25f8-4018-8976-30334bf11136\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.788654 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-telemetry-combined-ca-bundle\") pod \"a6ef0688-25f8-4018-8976-30334bf11136\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.788715 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snq9q\" (UniqueName: \"kubernetes.io/projected/a6ef0688-25f8-4018-8976-30334bf11136-kube-api-access-snq9q\") pod \"a6ef0688-25f8-4018-8976-30334bf11136\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.788776 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ssh-key-openstack-edpm-ipam\") pod \"a6ef0688-25f8-4018-8976-30334bf11136\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.788903 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-1\") pod \"a6ef0688-25f8-4018-8976-30334bf11136\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.788931 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-0\") pod \"a6ef0688-25f8-4018-8976-30334bf11136\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.788954 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-2\") pod \"a6ef0688-25f8-4018-8976-30334bf11136\" (UID: \"a6ef0688-25f8-4018-8976-30334bf11136\") " Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.796889 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ef0688-25f8-4018-8976-30334bf11136-kube-api-access-snq9q" 
(OuterVolumeSpecName: "kube-api-access-snq9q") pod "a6ef0688-25f8-4018-8976-30334bf11136" (UID: "a6ef0688-25f8-4018-8976-30334bf11136"). InnerVolumeSpecName "kube-api-access-snq9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.812972 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a6ef0688-25f8-4018-8976-30334bf11136" (UID: "a6ef0688-25f8-4018-8976-30334bf11136"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.831323 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a6ef0688-25f8-4018-8976-30334bf11136" (UID: "a6ef0688-25f8-4018-8976-30334bf11136"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.833869 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-inventory" (OuterVolumeSpecName: "inventory") pod "a6ef0688-25f8-4018-8976-30334bf11136" (UID: "a6ef0688-25f8-4018-8976-30334bf11136"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.837430 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a6ef0688-25f8-4018-8976-30334bf11136" (UID: "a6ef0688-25f8-4018-8976-30334bf11136"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.845952 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a6ef0688-25f8-4018-8976-30334bf11136" (UID: "a6ef0688-25f8-4018-8976-30334bf11136"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.852530 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a6ef0688-25f8-4018-8976-30334bf11136" (UID: "a6ef0688-25f8-4018-8976-30334bf11136"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.891281 4937 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.891312 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snq9q\" (UniqueName: \"kubernetes.io/projected/a6ef0688-25f8-4018-8976-30334bf11136-kube-api-access-snq9q\") on node \"crc\" DevicePath \"\"" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.891325 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.891342 4937 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.891356 4937 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.891369 4937 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 25 16:37:13 crc kubenswrapper[4937]: I0225 16:37:13.891382 4937 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6ef0688-25f8-4018-8976-30334bf11136-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 16:37:14 crc kubenswrapper[4937]: I0225 16:37:14.127455 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" event={"ID":"a6ef0688-25f8-4018-8976-30334bf11136","Type":"ContainerDied","Data":"161772e560b6805171d92ff8b88c56c4332ecb2415f1e0653e4baf7c6f93b2ee"} Feb 25 16:37:14 crc kubenswrapper[4937]: I0225 16:37:14.127937 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="161772e560b6805171d92ff8b88c56c4332ecb2415f1e0653e4baf7c6f93b2ee" Feb 25 16:37:14 crc kubenswrapper[4937]: I0225 16:37:14.127699 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g87qr" Feb 25 16:37:25 crc kubenswrapper[4937]: I0225 16:37:25.368760 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:37:25 crc kubenswrapper[4937]: E0225 16:37:25.371148 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:37:36 crc kubenswrapper[4937]: I0225 16:37:36.368754 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:37:36 crc kubenswrapper[4937]: E0225 16:37:36.369718 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:37:47 crc kubenswrapper[4937]: I0225 16:37:47.368323 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:37:47 crc kubenswrapper[4937]: E0225 16:37:47.369277 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.276759 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 25 16:37:56 crc kubenswrapper[4937]: E0225 16:37:56.277800 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933196b1-39bc-44f3-a2f0-c70e79efa389" containerName="registry-server" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.277818 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="933196b1-39bc-44f3-a2f0-c70e79efa389" containerName="registry-server" Feb 25 16:37:56 crc kubenswrapper[4937]: E0225 16:37:56.277834 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ef0688-25f8-4018-8976-30334bf11136" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.277845 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ef0688-25f8-4018-8976-30334bf11136" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 25 16:37:56 crc kubenswrapper[4937]: E0225 16:37:56.277863 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53669da0-a436-4537-b09e-7dad7fc0686a" containerName="extract-content" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.277871 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="53669da0-a436-4537-b09e-7dad7fc0686a" containerName="extract-content" Feb 25 16:37:56 crc kubenswrapper[4937]: 
E0225 16:37:56.277884 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933196b1-39bc-44f3-a2f0-c70e79efa389" containerName="extract-content" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.277891 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="933196b1-39bc-44f3-a2f0-c70e79efa389" containerName="extract-content" Feb 25 16:37:56 crc kubenswrapper[4937]: E0225 16:37:56.277911 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53669da0-a436-4537-b09e-7dad7fc0686a" containerName="registry-server" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.277917 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="53669da0-a436-4537-b09e-7dad7fc0686a" containerName="registry-server" Feb 25 16:37:56 crc kubenswrapper[4937]: E0225 16:37:56.277937 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53669da0-a436-4537-b09e-7dad7fc0686a" containerName="extract-utilities" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.277943 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="53669da0-a436-4537-b09e-7dad7fc0686a" containerName="extract-utilities" Feb 25 16:37:56 crc kubenswrapper[4937]: E0225 16:37:56.277956 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933196b1-39bc-44f3-a2f0-c70e79efa389" containerName="extract-utilities" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.277962 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="933196b1-39bc-44f3-a2f0-c70e79efa389" containerName="extract-utilities" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.278146 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="53669da0-a436-4537-b09e-7dad7fc0686a" containerName="registry-server" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.278163 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="933196b1-39bc-44f3-a2f0-c70e79efa389" containerName="registry-server" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.278179 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ef0688-25f8-4018-8976-30334bf11136" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.279023 4937 util.go:30] "No sandbox for pod can be found. 
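Before openstack/tempest-tests-tempest is admitted, the CPU and memory managers first clear checkpointed state for containers whose pods are already gone, which is what the RemoveStaleState / "Deleted CPUSet assignment" pairs above record for the departed marketplace and telemetry pods. A tiny sketch for tallying those cleanups per stale pod UID; kubelet.log is a hypothetical text dump of the journal:

```python
# Count RemoveStaleState entries per stale pod UID, to see which departed
# pods still had cpu/memory-manager state when a new pod was admitted.
# "kubelet.log" is a hypothetical text dump of this journal.
import re
from collections import Counter

stale = Counter()
uid_re = re.compile(r'podUID="([0-9a-f-]+)"')

with open("kubelet.log") as fh:
    for line in fh:
        if "RemoveStaleState" not in line:
            continue
        m = uid_re.search(line)
        if m:
            stale[m.group(1)] += 1

for uid, n in stale.most_common():
    print(f"{uid}: {n} RemoveStaleState entries")
```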
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.285826 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8pfdg" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.286069 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.286217 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.286426 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.293563 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.403738 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/994bcbfb-8270-42b1-bc77-6a262f2d29e3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.403801 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/994bcbfb-8270-42b1-bc77-6a262f2d29e3-config-data\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.403837 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.404027 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.404136 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/994bcbfb-8270-42b1-bc77-6a262f2d29e3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.404171 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.404370 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/994bcbfb-8270-42b1-bc77-6a262f2d29e3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.404446 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xht6\" (UniqueName: \"kubernetes.io/projected/994bcbfb-8270-42b1-bc77-6a262f2d29e3-kube-api-access-4xht6\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.404612 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.507163 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/994bcbfb-8270-42b1-bc77-6a262f2d29e3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.507218 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xht6\" (UniqueName: \"kubernetes.io/projected/994bcbfb-8270-42b1-bc77-6a262f2d29e3-kube-api-access-4xht6\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.507262 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.507332 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/994bcbfb-8270-42b1-bc77-6a262f2d29e3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.507374 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/994bcbfb-8270-42b1-bc77-6a262f2d29e3-config-data\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.507424 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.507470 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: 
\"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.507602 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/994bcbfb-8270-42b1-bc77-6a262f2d29e3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.507634 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.507851 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/994bcbfb-8270-42b1-bc77-6a262f2d29e3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.508146 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/994bcbfb-8270-42b1-bc77-6a262f2d29e3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.508929 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/994bcbfb-8270-42b1-bc77-6a262f2d29e3-config-data\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.509081 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.509992 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/994bcbfb-8270-42b1-bc77-6a262f2d29e3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.514120 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.514228 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc 
kubenswrapper[4937]: I0225 16:37:56.515353 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.530264 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xht6\" (UniqueName: \"kubernetes.io/projected/994bcbfb-8270-42b1-bc77-6a262f2d29e3-kube-api-access-4xht6\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.556476 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " pod="openstack/tempest-tests-tempest" Feb 25 16:37:56 crc kubenswrapper[4937]: I0225 16:37:56.603894 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 25 16:37:57 crc kubenswrapper[4937]: W0225 16:37:57.071244 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod994bcbfb_8270_42b1_bc77_6a262f2d29e3.slice/crio-1a8387000143fa1af1b551b73d2f1c232d4c67c2ac9fe5487ac633d74f453730 WatchSource:0}: Error finding container 1a8387000143fa1af1b551b73d2f1c232d4c67c2ac9fe5487ac633d74f453730: Status 404 returned error can't find the container with id 1a8387000143fa1af1b551b73d2f1c232d4c67c2ac9fe5487ac633d74f453730 Feb 25 16:37:57 crc kubenswrapper[4937]: I0225 16:37:57.073858 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 25 16:37:57 crc kubenswrapper[4937]: I0225 16:37:57.624930 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"994bcbfb-8270-42b1-bc77-6a262f2d29e3","Type":"ContainerStarted","Data":"1a8387000143fa1af1b551b73d2f1c232d4c67c2ac9fe5487ac633d74f453730"} Feb 25 16:38:00 crc kubenswrapper[4937]: I0225 16:38:00.139046 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533958-rfpf9"] Feb 25 16:38:00 crc kubenswrapper[4937]: I0225 16:38:00.141118 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533958-rfpf9" Feb 25 16:38:00 crc kubenswrapper[4937]: I0225 16:38:00.144272 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:38:00 crc kubenswrapper[4937]: I0225 16:38:00.145412 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:38:00 crc kubenswrapper[4937]: I0225 16:38:00.149159 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533958-rfpf9"] Feb 25 16:38:00 crc kubenswrapper[4937]: I0225 16:38:00.151596 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:38:00 crc kubenswrapper[4937]: I0225 16:38:00.304226 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rkbd\" (UniqueName: \"kubernetes.io/projected/5dc53569-34e0-4131-9a94-5f40d0f4f247-kube-api-access-7rkbd\") pod \"auto-csr-approver-29533958-rfpf9\" (UID: \"5dc53569-34e0-4131-9a94-5f40d0f4f247\") " pod="openshift-infra/auto-csr-approver-29533958-rfpf9" Feb 25 16:38:00 crc kubenswrapper[4937]: I0225 16:38:00.368059 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:38:00 crc kubenswrapper[4937]: E0225 16:38:00.368599 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:38:00 crc kubenswrapper[4937]: I0225 16:38:00.413016 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rkbd\" (UniqueName: \"kubernetes.io/projected/5dc53569-34e0-4131-9a94-5f40d0f4f247-kube-api-access-7rkbd\") pod \"auto-csr-approver-29533958-rfpf9\" (UID: \"5dc53569-34e0-4131-9a94-5f40d0f4f247\") " pod="openshift-infra/auto-csr-approver-29533958-rfpf9" Feb 25 16:38:00 crc kubenswrapper[4937]: I0225 16:38:00.438470 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rkbd\" (UniqueName: \"kubernetes.io/projected/5dc53569-34e0-4131-9a94-5f40d0f4f247-kube-api-access-7rkbd\") pod \"auto-csr-approver-29533958-rfpf9\" (UID: \"5dc53569-34e0-4131-9a94-5f40d0f4f247\") " pod="openshift-infra/auto-csr-approver-29533958-rfpf9" Feb 25 16:38:00 crc kubenswrapper[4937]: I0225 16:38:00.472725 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533958-rfpf9" Feb 25 16:38:03 crc kubenswrapper[4937]: I0225 16:38:03.197022 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533958-rfpf9"] Feb 25 16:38:03 crc kubenswrapper[4937]: I0225 16:38:03.686715 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533958-rfpf9" event={"ID":"5dc53569-34e0-4131-9a94-5f40d0f4f247","Type":"ContainerStarted","Data":"e963aa647d977b03246e25920e6aa7caa0f79c5e3dab8cdfdf76600bdf29453c"} Feb 25 16:38:05 crc kubenswrapper[4937]: I0225 16:38:05.710321 4937 generic.go:334] "Generic (PLEG): container finished" podID="5dc53569-34e0-4131-9a94-5f40d0f4f247" containerID="c165af6627b531334b08b189d2ece7ac3f9f7afd546b1660022f3f11a71e108e" exitCode=0 Feb 25 16:38:05 crc kubenswrapper[4937]: I0225 16:38:05.710375 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533958-rfpf9" event={"ID":"5dc53569-34e0-4131-9a94-5f40d0f4f247","Type":"ContainerDied","Data":"c165af6627b531334b08b189d2ece7ac3f9f7afd546b1660022f3f11a71e108e"} Feb 25 16:38:07 crc kubenswrapper[4937]: I0225 16:38:07.203023 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533958-rfpf9" Feb 25 16:38:07 crc kubenswrapper[4937]: I0225 16:38:07.393316 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rkbd\" (UniqueName: \"kubernetes.io/projected/5dc53569-34e0-4131-9a94-5f40d0f4f247-kube-api-access-7rkbd\") pod \"5dc53569-34e0-4131-9a94-5f40d0f4f247\" (UID: \"5dc53569-34e0-4131-9a94-5f40d0f4f247\") " Feb 25 16:38:07 crc kubenswrapper[4937]: I0225 16:38:07.400014 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc53569-34e0-4131-9a94-5f40d0f4f247-kube-api-access-7rkbd" (OuterVolumeSpecName: "kube-api-access-7rkbd") pod "5dc53569-34e0-4131-9a94-5f40d0f4f247" (UID: "5dc53569-34e0-4131-9a94-5f40d0f4f247"). InnerVolumeSpecName "kube-api-access-7rkbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:38:07 crc kubenswrapper[4937]: I0225 16:38:07.504083 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rkbd\" (UniqueName: \"kubernetes.io/projected/5dc53569-34e0-4131-9a94-5f40d0f4f247-kube-api-access-7rkbd\") on node \"crc\" DevicePath \"\"" Feb 25 16:38:07 crc kubenswrapper[4937]: I0225 16:38:07.736876 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533958-rfpf9" event={"ID":"5dc53569-34e0-4131-9a94-5f40d0f4f247","Type":"ContainerDied","Data":"e963aa647d977b03246e25920e6aa7caa0f79c5e3dab8cdfdf76600bdf29453c"} Feb 25 16:38:07 crc kubenswrapper[4937]: I0225 16:38:07.737196 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e963aa647d977b03246e25920e6aa7caa0f79c5e3dab8cdfdf76600bdf29453c" Feb 25 16:38:07 crc kubenswrapper[4937]: I0225 16:38:07.736990 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533958-rfpf9" Feb 25 16:38:08 crc kubenswrapper[4937]: I0225 16:38:08.280691 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533952-fbfsk"] Feb 25 16:38:08 crc kubenswrapper[4937]: I0225 16:38:08.291827 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533952-fbfsk"] Feb 25 16:38:09 crc kubenswrapper[4937]: I0225 16:38:09.380978 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d355edc-ac55-4d29-9ece-deb38974fdbc" path="/var/lib/kubelet/pods/9d355edc-ac55-4d29-9ece-deb38974fdbc/volumes" Feb 25 16:38:14 crc kubenswrapper[4937]: I0225 16:38:14.367311 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:38:14 crc kubenswrapper[4937]: E0225 16:38:14.368102 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:38:29 crc kubenswrapper[4937]: I0225 16:38:29.367925 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:38:29 crc kubenswrapper[4937]: E0225 16:38:29.368772 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:38:34 crc kubenswrapper[4937]: E0225 16:38:34.276374 4937 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 25 16:38:34 crc kubenswrapper[4937]: E0225 16:38:34.277584 4937 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4xht6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(994bcbfb-8270-42b1-bc77-6a262f2d29e3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 16:38:34 crc kubenswrapper[4937]: E0225 16:38:34.278885 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="994bcbfb-8270-42b1-bc77-6a262f2d29e3" Feb 25 16:38:35 crc kubenswrapper[4937]: E0225 16:38:35.060350 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="994bcbfb-8270-42b1-bc77-6a262f2d29e3" Feb 25 16:38:42 crc kubenswrapper[4937]: I0225 16:38:42.163201 4937 scope.go:117] "RemoveContainer" containerID="120ad2dd64c86f8b0f145c84ff5da0c3a3649a29f6f0351dc0bf211404443128" Feb 25 16:38:44 crc kubenswrapper[4937]: I0225 16:38:44.367939 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:38:44 crc kubenswrapper[4937]: E0225 16:38:44.368692 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.055784 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wpgnj"] Feb 25 16:38:48 crc kubenswrapper[4937]: E0225 16:38:48.056722 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc53569-34e0-4131-9a94-5f40d0f4f247" containerName="oc" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.056734 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc53569-34e0-4131-9a94-5f40d0f4f247" containerName="oc" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.056942 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc53569-34e0-4131-9a94-5f40d0f4f247" containerName="oc" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.058615 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.068613 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpgnj"] Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.161824 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27721a82-c9aa-467b-a6d7-8161323b0989-utilities\") pod \"redhat-marketplace-wpgnj\" (UID: \"27721a82-c9aa-467b-a6d7-8161323b0989\") " pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.162039 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9fqp\" (UniqueName: \"kubernetes.io/projected/27721a82-c9aa-467b-a6d7-8161323b0989-kube-api-access-f9fqp\") pod \"redhat-marketplace-wpgnj\" (UID: \"27721a82-c9aa-467b-a6d7-8161323b0989\") " pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.162249 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27721a82-c9aa-467b-a6d7-8161323b0989-catalog-content\") pod \"redhat-marketplace-wpgnj\" (UID: \"27721a82-c9aa-467b-a6d7-8161323b0989\") " pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.210868 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"994bcbfb-8270-42b1-bc77-6a262f2d29e3","Type":"ContainerStarted","Data":"7de2c2135b975ec78bef7ded176cdec997b2237395b1a411360dcc097e1f0896"} Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.243356 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k4dkk"] Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.246534 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.254783 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.24357957 podStartE2EDuration="53.254764365s" podCreationTimestamp="2026-02-25 16:37:55 +0000 UTC" firstStartedPulling="2026-02-25 16:37:57.074231962 +0000 UTC m=+3128.087623852" lastFinishedPulling="2026-02-25 16:38:47.085416717 +0000 UTC m=+3178.098808647" observedRunningTime="2026-02-25 16:38:48.240002305 +0000 UTC m=+3179.253394205" watchObservedRunningTime="2026-02-25 16:38:48.254764365 +0000 UTC m=+3179.268156245" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.264369 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27721a82-c9aa-467b-a6d7-8161323b0989-catalog-content\") pod \"redhat-marketplace-wpgnj\" (UID: \"27721a82-c9aa-467b-a6d7-8161323b0989\") " pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.264507 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27721a82-c9aa-467b-a6d7-8161323b0989-utilities\") pod \"redhat-marketplace-wpgnj\" (UID: \"27721a82-c9aa-467b-a6d7-8161323b0989\") " pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.264622 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9fqp\" (UniqueName: \"kubernetes.io/projected/27721a82-c9aa-467b-a6d7-8161323b0989-kube-api-access-f9fqp\") pod \"redhat-marketplace-wpgnj\" (UID: \"27721a82-c9aa-467b-a6d7-8161323b0989\") " pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.265555 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27721a82-c9aa-467b-a6d7-8161323b0989-catalog-content\") pod \"redhat-marketplace-wpgnj\" (UID: \"27721a82-c9aa-467b-a6d7-8161323b0989\") " pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.265874 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27721a82-c9aa-467b-a6d7-8161323b0989-utilities\") pod \"redhat-marketplace-wpgnj\" (UID: \"27721a82-c9aa-467b-a6d7-8161323b0989\") " pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.285822 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4dkk"] Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.286957 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9fqp\" (UniqueName: \"kubernetes.io/projected/27721a82-c9aa-467b-a6d7-8161323b0989-kube-api-access-f9fqp\") pod \"redhat-marketplace-wpgnj\" (UID: \"27721a82-c9aa-467b-a6d7-8161323b0989\") " pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.366406 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/078d89fe-7604-4fe0-9394-52ed747f56a0-utilities\") pod \"redhat-operators-k4dkk\" (UID: 
\"078d89fe-7604-4fe0-9394-52ed747f56a0\") " pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.368314 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/078d89fe-7604-4fe0-9394-52ed747f56a0-catalog-content\") pod \"redhat-operators-k4dkk\" (UID: \"078d89fe-7604-4fe0-9394-52ed747f56a0\") " pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.368823 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddrjv\" (UniqueName: \"kubernetes.io/projected/078d89fe-7604-4fe0-9394-52ed747f56a0-kube-api-access-ddrjv\") pod \"redhat-operators-k4dkk\" (UID: \"078d89fe-7604-4fe0-9394-52ed747f56a0\") " pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.383727 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.472185 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddrjv\" (UniqueName: \"kubernetes.io/projected/078d89fe-7604-4fe0-9394-52ed747f56a0-kube-api-access-ddrjv\") pod \"redhat-operators-k4dkk\" (UID: \"078d89fe-7604-4fe0-9394-52ed747f56a0\") " pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.473279 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/078d89fe-7604-4fe0-9394-52ed747f56a0-utilities\") pod \"redhat-operators-k4dkk\" (UID: \"078d89fe-7604-4fe0-9394-52ed747f56a0\") " pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.473410 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/078d89fe-7604-4fe0-9394-52ed747f56a0-catalog-content\") pod \"redhat-operators-k4dkk\" (UID: \"078d89fe-7604-4fe0-9394-52ed747f56a0\") " pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.473912 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/078d89fe-7604-4fe0-9394-52ed747f56a0-utilities\") pod \"redhat-operators-k4dkk\" (UID: \"078d89fe-7604-4fe0-9394-52ed747f56a0\") " pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.474003 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/078d89fe-7604-4fe0-9394-52ed747f56a0-catalog-content\") pod \"redhat-operators-k4dkk\" (UID: \"078d89fe-7604-4fe0-9394-52ed747f56a0\") " pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.493367 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddrjv\" (UniqueName: \"kubernetes.io/projected/078d89fe-7604-4fe0-9394-52ed747f56a0-kube-api-access-ddrjv\") pod \"redhat-operators-k4dkk\" (UID: \"078d89fe-7604-4fe0-9394-52ed747f56a0\") " pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.568093 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:38:48 crc kubenswrapper[4937]: I0225 16:38:48.933865 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpgnj"] Feb 25 16:38:48 crc kubenswrapper[4937]: W0225 16:38:48.936802 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27721a82_c9aa_467b_a6d7_8161323b0989.slice/crio-6737ed9218f4b3d4cfa6943f280873dc69c6af94fd171637fee1a0b386b9ccc8 WatchSource:0}: Error finding container 6737ed9218f4b3d4cfa6943f280873dc69c6af94fd171637fee1a0b386b9ccc8: Status 404 returned error can't find the container with id 6737ed9218f4b3d4cfa6943f280873dc69c6af94fd171637fee1a0b386b9ccc8 Feb 25 16:38:49 crc kubenswrapper[4937]: I0225 16:38:49.146235 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4dkk"] Feb 25 16:38:49 crc kubenswrapper[4937]: I0225 16:38:49.223001 4937 generic.go:334] "Generic (PLEG): container finished" podID="27721a82-c9aa-467b-a6d7-8161323b0989" containerID="c801b87c77f81c08ffafbc9831a4d6886d54f35be918daaeefd4926fac92ebb4" exitCode=0 Feb 25 16:38:49 crc kubenswrapper[4937]: I0225 16:38:49.223051 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpgnj" event={"ID":"27721a82-c9aa-467b-a6d7-8161323b0989","Type":"ContainerDied","Data":"c801b87c77f81c08ffafbc9831a4d6886d54f35be918daaeefd4926fac92ebb4"} Feb 25 16:38:49 crc kubenswrapper[4937]: I0225 16:38:49.223100 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpgnj" event={"ID":"27721a82-c9aa-467b-a6d7-8161323b0989","Type":"ContainerStarted","Data":"6737ed9218f4b3d4cfa6943f280873dc69c6af94fd171637fee1a0b386b9ccc8"} Feb 25 16:38:49 crc kubenswrapper[4937]: I0225 16:38:49.230061 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4dkk" event={"ID":"078d89fe-7604-4fe0-9394-52ed747f56a0","Type":"ContainerStarted","Data":"d40d35dc2a011ac9441ac7487a4adeee3d311d6696995ea2f96b68deaf53c2ad"} Feb 25 16:38:50 crc kubenswrapper[4937]: I0225 16:38:50.243016 4937 generic.go:334] "Generic (PLEG): container finished" podID="078d89fe-7604-4fe0-9394-52ed747f56a0" containerID="22e4c9f7a99298967a7caef18003016cb66a1a8bf4210591492ccd423d17b57b" exitCode=0 Feb 25 16:38:50 crc kubenswrapper[4937]: I0225 16:38:50.243063 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4dkk" event={"ID":"078d89fe-7604-4fe0-9394-52ed747f56a0","Type":"ContainerDied","Data":"22e4c9f7a99298967a7caef18003016cb66a1a8bf4210591492ccd423d17b57b"} Feb 25 16:38:50 crc kubenswrapper[4937]: I0225 16:38:50.246852 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpgnj" event={"ID":"27721a82-c9aa-467b-a6d7-8161323b0989","Type":"ContainerStarted","Data":"557fa8cf9c6c0be279ec0bc3ecacc4957ebedf5704a74a961123d47122b8133d"} Feb 25 16:38:51 crc kubenswrapper[4937]: I0225 16:38:51.259059 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4dkk" event={"ID":"078d89fe-7604-4fe0-9394-52ed747f56a0","Type":"ContainerStarted","Data":"e537173937bdf4d1acc9b5cd2e80a66715d56bdd08984a93736ced77aa12504b"} Feb 25 16:38:51 crc kubenswrapper[4937]: I0225 16:38:51.261272 4937 generic.go:334] "Generic (PLEG): container finished" 
podID="27721a82-c9aa-467b-a6d7-8161323b0989" containerID="557fa8cf9c6c0be279ec0bc3ecacc4957ebedf5704a74a961123d47122b8133d" exitCode=0 Feb 25 16:38:51 crc kubenswrapper[4937]: I0225 16:38:51.261309 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpgnj" event={"ID":"27721a82-c9aa-467b-a6d7-8161323b0989","Type":"ContainerDied","Data":"557fa8cf9c6c0be279ec0bc3ecacc4957ebedf5704a74a961123d47122b8133d"} Feb 25 16:38:52 crc kubenswrapper[4937]: I0225 16:38:52.275553 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpgnj" event={"ID":"27721a82-c9aa-467b-a6d7-8161323b0989","Type":"ContainerStarted","Data":"9e954e1202573d075b59bb5392a8aa9df406c75f5d3adb5515d22687cc1c1d04"} Feb 25 16:38:52 crc kubenswrapper[4937]: I0225 16:38:52.298306 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wpgnj" podStartSLOduration=1.719879647 podStartE2EDuration="4.298285081s" podCreationTimestamp="2026-02-25 16:38:48 +0000 UTC" firstStartedPulling="2026-02-25 16:38:49.226376774 +0000 UTC m=+3180.239768664" lastFinishedPulling="2026-02-25 16:38:51.804782198 +0000 UTC m=+3182.818174098" observedRunningTime="2026-02-25 16:38:52.291441429 +0000 UTC m=+3183.304833329" watchObservedRunningTime="2026-02-25 16:38:52.298285081 +0000 UTC m=+3183.311676971" Feb 25 16:38:55 crc kubenswrapper[4937]: I0225 16:38:55.368592 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:38:55 crc kubenswrapper[4937]: E0225 16:38:55.369733 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:38:58 crc kubenswrapper[4937]: I0225 16:38:58.384532 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:58 crc kubenswrapper[4937]: I0225 16:38:58.385110 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:58 crc kubenswrapper[4937]: I0225 16:38:58.435246 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:59 crc kubenswrapper[4937]: I0225 16:38:59.424316 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:38:59 crc kubenswrapper[4937]: I0225 16:38:59.487411 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpgnj"] Feb 25 16:39:01 crc kubenswrapper[4937]: I0225 16:39:01.390366 4937 generic.go:334] "Generic (PLEG): container finished" podID="078d89fe-7604-4fe0-9394-52ed747f56a0" containerID="e537173937bdf4d1acc9b5cd2e80a66715d56bdd08984a93736ced77aa12504b" exitCode=0 Feb 25 16:39:01 crc kubenswrapper[4937]: I0225 16:39:01.391180 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wpgnj" podUID="27721a82-c9aa-467b-a6d7-8161323b0989" containerName="registry-server" 
containerID="cri-o://9e954e1202573d075b59bb5392a8aa9df406c75f5d3adb5515d22687cc1c1d04" gracePeriod=2 Feb 25 16:39:01 crc kubenswrapper[4937]: I0225 16:39:01.390462 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4dkk" event={"ID":"078d89fe-7604-4fe0-9394-52ed747f56a0","Type":"ContainerDied","Data":"e537173937bdf4d1acc9b5cd2e80a66715d56bdd08984a93736ced77aa12504b"} Feb 25 16:39:01 crc kubenswrapper[4937]: I0225 16:39:01.395842 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.010475 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.105688 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27721a82-c9aa-467b-a6d7-8161323b0989-catalog-content\") pod \"27721a82-c9aa-467b-a6d7-8161323b0989\" (UID: \"27721a82-c9aa-467b-a6d7-8161323b0989\") " Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.106055 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27721a82-c9aa-467b-a6d7-8161323b0989-utilities\") pod \"27721a82-c9aa-467b-a6d7-8161323b0989\" (UID: \"27721a82-c9aa-467b-a6d7-8161323b0989\") " Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.106105 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9fqp\" (UniqueName: \"kubernetes.io/projected/27721a82-c9aa-467b-a6d7-8161323b0989-kube-api-access-f9fqp\") pod \"27721a82-c9aa-467b-a6d7-8161323b0989\" (UID: \"27721a82-c9aa-467b-a6d7-8161323b0989\") " Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.106789 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27721a82-c9aa-467b-a6d7-8161323b0989-utilities" (OuterVolumeSpecName: "utilities") pod "27721a82-c9aa-467b-a6d7-8161323b0989" (UID: "27721a82-c9aa-467b-a6d7-8161323b0989"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.115899 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27721a82-c9aa-467b-a6d7-8161323b0989-kube-api-access-f9fqp" (OuterVolumeSpecName: "kube-api-access-f9fqp") pod "27721a82-c9aa-467b-a6d7-8161323b0989" (UID: "27721a82-c9aa-467b-a6d7-8161323b0989"). InnerVolumeSpecName "kube-api-access-f9fqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.140638 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27721a82-c9aa-467b-a6d7-8161323b0989-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27721a82-c9aa-467b-a6d7-8161323b0989" (UID: "27721a82-c9aa-467b-a6d7-8161323b0989"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.208470 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27721a82-c9aa-467b-a6d7-8161323b0989-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.208519 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9fqp\" (UniqueName: \"kubernetes.io/projected/27721a82-c9aa-467b-a6d7-8161323b0989-kube-api-access-f9fqp\") on node \"crc\" DevicePath \"\"" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.208532 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27721a82-c9aa-467b-a6d7-8161323b0989-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.405515 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4dkk" event={"ID":"078d89fe-7604-4fe0-9394-52ed747f56a0","Type":"ContainerStarted","Data":"3fa0d2832049e2f6f1e1ae423466008ecb603e3d2a17ff31fd896726f7b2eaca"} Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.412253 4937 generic.go:334] "Generic (PLEG): container finished" podID="27721a82-c9aa-467b-a6d7-8161323b0989" containerID="9e954e1202573d075b59bb5392a8aa9df406c75f5d3adb5515d22687cc1c1d04" exitCode=0 Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.412346 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpgnj" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.412358 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpgnj" event={"ID":"27721a82-c9aa-467b-a6d7-8161323b0989","Type":"ContainerDied","Data":"9e954e1202573d075b59bb5392a8aa9df406c75f5d3adb5515d22687cc1c1d04"} Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.412435 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpgnj" event={"ID":"27721a82-c9aa-467b-a6d7-8161323b0989","Type":"ContainerDied","Data":"6737ed9218f4b3d4cfa6943f280873dc69c6af94fd171637fee1a0b386b9ccc8"} Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.412477 4937 scope.go:117] "RemoveContainer" containerID="9e954e1202573d075b59bb5392a8aa9df406c75f5d3adb5515d22687cc1c1d04" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.430970 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k4dkk" podStartSLOduration=2.850414591 podStartE2EDuration="14.430951622s" podCreationTimestamp="2026-02-25 16:38:48 +0000 UTC" firstStartedPulling="2026-02-25 16:38:50.245154622 +0000 UTC m=+3181.258546512" lastFinishedPulling="2026-02-25 16:39:01.825691653 +0000 UTC m=+3192.839083543" observedRunningTime="2026-02-25 16:39:02.42965437 +0000 UTC m=+3193.443046280" watchObservedRunningTime="2026-02-25 16:39:02.430951622 +0000 UTC m=+3193.444343512" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.457773 4937 scope.go:117] "RemoveContainer" containerID="557fa8cf9c6c0be279ec0bc3ecacc4957ebedf5704a74a961123d47122b8133d" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.481936 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpgnj"] Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.491391 4937 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-wpgnj"] Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.494895 4937 scope.go:117] "RemoveContainer" containerID="c801b87c77f81c08ffafbc9831a4d6886d54f35be918daaeefd4926fac92ebb4" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.543377 4937 scope.go:117] "RemoveContainer" containerID="9e954e1202573d075b59bb5392a8aa9df406c75f5d3adb5515d22687cc1c1d04" Feb 25 16:39:02 crc kubenswrapper[4937]: E0225 16:39:02.544755 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e954e1202573d075b59bb5392a8aa9df406c75f5d3adb5515d22687cc1c1d04\": container with ID starting with 9e954e1202573d075b59bb5392a8aa9df406c75f5d3adb5515d22687cc1c1d04 not found: ID does not exist" containerID="9e954e1202573d075b59bb5392a8aa9df406c75f5d3adb5515d22687cc1c1d04" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.544816 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e954e1202573d075b59bb5392a8aa9df406c75f5d3adb5515d22687cc1c1d04"} err="failed to get container status \"9e954e1202573d075b59bb5392a8aa9df406c75f5d3adb5515d22687cc1c1d04\": rpc error: code = NotFound desc = could not find container \"9e954e1202573d075b59bb5392a8aa9df406c75f5d3adb5515d22687cc1c1d04\": container with ID starting with 9e954e1202573d075b59bb5392a8aa9df406c75f5d3adb5515d22687cc1c1d04 not found: ID does not exist" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.544852 4937 scope.go:117] "RemoveContainer" containerID="557fa8cf9c6c0be279ec0bc3ecacc4957ebedf5704a74a961123d47122b8133d" Feb 25 16:39:02 crc kubenswrapper[4937]: E0225 16:39:02.545529 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"557fa8cf9c6c0be279ec0bc3ecacc4957ebedf5704a74a961123d47122b8133d\": container with ID starting with 557fa8cf9c6c0be279ec0bc3ecacc4957ebedf5704a74a961123d47122b8133d not found: ID does not exist" containerID="557fa8cf9c6c0be279ec0bc3ecacc4957ebedf5704a74a961123d47122b8133d" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.545582 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"557fa8cf9c6c0be279ec0bc3ecacc4957ebedf5704a74a961123d47122b8133d"} err="failed to get container status \"557fa8cf9c6c0be279ec0bc3ecacc4957ebedf5704a74a961123d47122b8133d\": rpc error: code = NotFound desc = could not find container \"557fa8cf9c6c0be279ec0bc3ecacc4957ebedf5704a74a961123d47122b8133d\": container with ID starting with 557fa8cf9c6c0be279ec0bc3ecacc4957ebedf5704a74a961123d47122b8133d not found: ID does not exist" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.545619 4937 scope.go:117] "RemoveContainer" containerID="c801b87c77f81c08ffafbc9831a4d6886d54f35be918daaeefd4926fac92ebb4" Feb 25 16:39:02 crc kubenswrapper[4937]: E0225 16:39:02.546020 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c801b87c77f81c08ffafbc9831a4d6886d54f35be918daaeefd4926fac92ebb4\": container with ID starting with c801b87c77f81c08ffafbc9831a4d6886d54f35be918daaeefd4926fac92ebb4 not found: ID does not exist" containerID="c801b87c77f81c08ffafbc9831a4d6886d54f35be918daaeefd4926fac92ebb4" Feb 25 16:39:02 crc kubenswrapper[4937]: I0225 16:39:02.546053 4937 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c801b87c77f81c08ffafbc9831a4d6886d54f35be918daaeefd4926fac92ebb4"} err="failed to get container status \"c801b87c77f81c08ffafbc9831a4d6886d54f35be918daaeefd4926fac92ebb4\": rpc error: code = NotFound desc = could not find container \"c801b87c77f81c08ffafbc9831a4d6886d54f35be918daaeefd4926fac92ebb4\": container with ID starting with c801b87c77f81c08ffafbc9831a4d6886d54f35be918daaeefd4926fac92ebb4 not found: ID does not exist" Feb 25 16:39:03 crc kubenswrapper[4937]: I0225 16:39:03.381049 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27721a82-c9aa-467b-a6d7-8161323b0989" path="/var/lib/kubelet/pods/27721a82-c9aa-467b-a6d7-8161323b0989/volumes" Feb 25 16:39:07 crc kubenswrapper[4937]: I0225 16:39:07.368836 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:39:07 crc kubenswrapper[4937]: E0225 16:39:07.370160 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:39:08 crc kubenswrapper[4937]: I0225 16:39:08.569713 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:39:08 crc kubenswrapper[4937]: I0225 16:39:08.570115 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:39:09 crc kubenswrapper[4937]: I0225 16:39:09.645525 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k4dkk" podUID="078d89fe-7604-4fe0-9394-52ed747f56a0" containerName="registry-server" probeResult="failure" output=< Feb 25 16:39:09 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 16:39:09 crc kubenswrapper[4937]: > Feb 25 16:39:18 crc kubenswrapper[4937]: I0225 16:39:18.367978 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:39:18 crc kubenswrapper[4937]: E0225 16:39:18.369758 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:39:18 crc kubenswrapper[4937]: I0225 16:39:18.642855 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:39:18 crc kubenswrapper[4937]: I0225 16:39:18.705540 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:39:19 crc kubenswrapper[4937]: I0225 16:39:19.244394 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k4dkk"] Feb 25 16:39:20 crc kubenswrapper[4937]: I0225 16:39:20.651657 4937 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-k4dkk" podUID="078d89fe-7604-4fe0-9394-52ed747f56a0" containerName="registry-server" containerID="cri-o://3fa0d2832049e2f6f1e1ae423466008ecb603e3d2a17ff31fd896726f7b2eaca" gracePeriod=2 Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.219878 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.315699 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddrjv\" (UniqueName: \"kubernetes.io/projected/078d89fe-7604-4fe0-9394-52ed747f56a0-kube-api-access-ddrjv\") pod \"078d89fe-7604-4fe0-9394-52ed747f56a0\" (UID: \"078d89fe-7604-4fe0-9394-52ed747f56a0\") " Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.315949 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/078d89fe-7604-4fe0-9394-52ed747f56a0-catalog-content\") pod \"078d89fe-7604-4fe0-9394-52ed747f56a0\" (UID: \"078d89fe-7604-4fe0-9394-52ed747f56a0\") " Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.315982 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/078d89fe-7604-4fe0-9394-52ed747f56a0-utilities\") pod \"078d89fe-7604-4fe0-9394-52ed747f56a0\" (UID: \"078d89fe-7604-4fe0-9394-52ed747f56a0\") " Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.316807 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/078d89fe-7604-4fe0-9394-52ed747f56a0-utilities" (OuterVolumeSpecName: "utilities") pod "078d89fe-7604-4fe0-9394-52ed747f56a0" (UID: "078d89fe-7604-4fe0-9394-52ed747f56a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.322695 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/078d89fe-7604-4fe0-9394-52ed747f56a0-kube-api-access-ddrjv" (OuterVolumeSpecName: "kube-api-access-ddrjv") pod "078d89fe-7604-4fe0-9394-52ed747f56a0" (UID: "078d89fe-7604-4fe0-9394-52ed747f56a0"). InnerVolumeSpecName "kube-api-access-ddrjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.420254 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/078d89fe-7604-4fe0-9394-52ed747f56a0-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.420289 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddrjv\" (UniqueName: \"kubernetes.io/projected/078d89fe-7604-4fe0-9394-52ed747f56a0-kube-api-access-ddrjv\") on node \"crc\" DevicePath \"\"" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.452206 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/078d89fe-7604-4fe0-9394-52ed747f56a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "078d89fe-7604-4fe0-9394-52ed747f56a0" (UID: "078d89fe-7604-4fe0-9394-52ed747f56a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.521642 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/078d89fe-7604-4fe0-9394-52ed747f56a0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.671200 4937 generic.go:334] "Generic (PLEG): container finished" podID="078d89fe-7604-4fe0-9394-52ed747f56a0" containerID="3fa0d2832049e2f6f1e1ae423466008ecb603e3d2a17ff31fd896726f7b2eaca" exitCode=0 Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.671247 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4dkk" event={"ID":"078d89fe-7604-4fe0-9394-52ed747f56a0","Type":"ContainerDied","Data":"3fa0d2832049e2f6f1e1ae423466008ecb603e3d2a17ff31fd896726f7b2eaca"} Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.671290 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4dkk" event={"ID":"078d89fe-7604-4fe0-9394-52ed747f56a0","Type":"ContainerDied","Data":"d40d35dc2a011ac9441ac7487a4adeee3d311d6696995ea2f96b68deaf53c2ad"} Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.671316 4937 scope.go:117] "RemoveContainer" containerID="3fa0d2832049e2f6f1e1ae423466008ecb603e3d2a17ff31fd896726f7b2eaca" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.671392 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4dkk" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.694364 4937 scope.go:117] "RemoveContainer" containerID="e537173937bdf4d1acc9b5cd2e80a66715d56bdd08984a93736ced77aa12504b" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.718523 4937 scope.go:117] "RemoveContainer" containerID="22e4c9f7a99298967a7caef18003016cb66a1a8bf4210591492ccd423d17b57b" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.721266 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k4dkk"] Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.731236 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k4dkk"] Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.771873 4937 scope.go:117] "RemoveContainer" containerID="3fa0d2832049e2f6f1e1ae423466008ecb603e3d2a17ff31fd896726f7b2eaca" Feb 25 16:39:21 crc kubenswrapper[4937]: E0225 16:39:21.772434 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fa0d2832049e2f6f1e1ae423466008ecb603e3d2a17ff31fd896726f7b2eaca\": container with ID starting with 3fa0d2832049e2f6f1e1ae423466008ecb603e3d2a17ff31fd896726f7b2eaca not found: ID does not exist" containerID="3fa0d2832049e2f6f1e1ae423466008ecb603e3d2a17ff31fd896726f7b2eaca" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.772495 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fa0d2832049e2f6f1e1ae423466008ecb603e3d2a17ff31fd896726f7b2eaca"} err="failed to get container status \"3fa0d2832049e2f6f1e1ae423466008ecb603e3d2a17ff31fd896726f7b2eaca\": rpc error: code = NotFound desc = could not find container \"3fa0d2832049e2f6f1e1ae423466008ecb603e3d2a17ff31fd896726f7b2eaca\": container with ID starting with 3fa0d2832049e2f6f1e1ae423466008ecb603e3d2a17ff31fd896726f7b2eaca not found: ID does not exist" Feb 25 16:39:21 crc 
kubenswrapper[4937]: I0225 16:39:21.772523 4937 scope.go:117] "RemoveContainer" containerID="e537173937bdf4d1acc9b5cd2e80a66715d56bdd08984a93736ced77aa12504b" Feb 25 16:39:21 crc kubenswrapper[4937]: E0225 16:39:21.772979 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e537173937bdf4d1acc9b5cd2e80a66715d56bdd08984a93736ced77aa12504b\": container with ID starting with e537173937bdf4d1acc9b5cd2e80a66715d56bdd08984a93736ced77aa12504b not found: ID does not exist" containerID="e537173937bdf4d1acc9b5cd2e80a66715d56bdd08984a93736ced77aa12504b" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.773013 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e537173937bdf4d1acc9b5cd2e80a66715d56bdd08984a93736ced77aa12504b"} err="failed to get container status \"e537173937bdf4d1acc9b5cd2e80a66715d56bdd08984a93736ced77aa12504b\": rpc error: code = NotFound desc = could not find container \"e537173937bdf4d1acc9b5cd2e80a66715d56bdd08984a93736ced77aa12504b\": container with ID starting with e537173937bdf4d1acc9b5cd2e80a66715d56bdd08984a93736ced77aa12504b not found: ID does not exist" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.773031 4937 scope.go:117] "RemoveContainer" containerID="22e4c9f7a99298967a7caef18003016cb66a1a8bf4210591492ccd423d17b57b" Feb 25 16:39:21 crc kubenswrapper[4937]: E0225 16:39:21.773279 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e4c9f7a99298967a7caef18003016cb66a1a8bf4210591492ccd423d17b57b\": container with ID starting with 22e4c9f7a99298967a7caef18003016cb66a1a8bf4210591492ccd423d17b57b not found: ID does not exist" containerID="22e4c9f7a99298967a7caef18003016cb66a1a8bf4210591492ccd423d17b57b" Feb 25 16:39:21 crc kubenswrapper[4937]: I0225 16:39:21.773310 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e4c9f7a99298967a7caef18003016cb66a1a8bf4210591492ccd423d17b57b"} err="failed to get container status \"22e4c9f7a99298967a7caef18003016cb66a1a8bf4210591492ccd423d17b57b\": rpc error: code = NotFound desc = could not find container \"22e4c9f7a99298967a7caef18003016cb66a1a8bf4210591492ccd423d17b57b\": container with ID starting with 22e4c9f7a99298967a7caef18003016cb66a1a8bf4210591492ccd423d17b57b not found: ID does not exist" Feb 25 16:39:23 crc kubenswrapper[4937]: I0225 16:39:23.383147 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="078d89fe-7604-4fe0-9394-52ed747f56a0" path="/var/lib/kubelet/pods/078d89fe-7604-4fe0-9394-52ed747f56a0/volumes" Feb 25 16:39:31 crc kubenswrapper[4937]: I0225 16:39:31.377388 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:39:31 crc kubenswrapper[4937]: E0225 16:39:31.380231 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:39:42 crc kubenswrapper[4937]: I0225 16:39:42.368970 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" 
Feb 25 16:39:42 crc kubenswrapper[4937]: E0225 16:39:42.369679 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:39:53 crc kubenswrapper[4937]: I0225 16:39:53.368232 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:39:53 crc kubenswrapper[4937]: E0225 16:39:53.369360 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.148685 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533960-g7p4h"] Feb 25 16:40:00 crc kubenswrapper[4937]: E0225 16:40:00.149320 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27721a82-c9aa-467b-a6d7-8161323b0989" containerName="extract-content" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.149330 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="27721a82-c9aa-467b-a6d7-8161323b0989" containerName="extract-content" Feb 25 16:40:00 crc kubenswrapper[4937]: E0225 16:40:00.149342 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27721a82-c9aa-467b-a6d7-8161323b0989" containerName="registry-server" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.149347 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="27721a82-c9aa-467b-a6d7-8161323b0989" containerName="registry-server" Feb 25 16:40:00 crc kubenswrapper[4937]: E0225 16:40:00.149354 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="078d89fe-7604-4fe0-9394-52ed747f56a0" containerName="registry-server" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.149360 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="078d89fe-7604-4fe0-9394-52ed747f56a0" containerName="registry-server" Feb 25 16:40:00 crc kubenswrapper[4937]: E0225 16:40:00.149367 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27721a82-c9aa-467b-a6d7-8161323b0989" containerName="extract-utilities" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.149372 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="27721a82-c9aa-467b-a6d7-8161323b0989" containerName="extract-utilities" Feb 25 16:40:00 crc kubenswrapper[4937]: E0225 16:40:00.149386 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="078d89fe-7604-4fe0-9394-52ed747f56a0" containerName="extract-content" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.149391 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="078d89fe-7604-4fe0-9394-52ed747f56a0" containerName="extract-content" Feb 25 16:40:00 crc kubenswrapper[4937]: E0225 16:40:00.149424 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="078d89fe-7604-4fe0-9394-52ed747f56a0" containerName="extract-utilities" Feb 25 16:40:00 crc 
kubenswrapper[4937]: I0225 16:40:00.149429 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="078d89fe-7604-4fe0-9394-52ed747f56a0" containerName="extract-utilities" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.149614 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="27721a82-c9aa-467b-a6d7-8161323b0989" containerName="registry-server" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.149638 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="078d89fe-7604-4fe0-9394-52ed747f56a0" containerName="registry-server" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.150327 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533960-g7p4h" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.152748 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.153713 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.153827 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.167789 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533960-g7p4h"] Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.282323 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xt75\" (UniqueName: \"kubernetes.io/projected/b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b-kube-api-access-6xt75\") pod \"auto-csr-approver-29533960-g7p4h\" (UID: \"b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b\") " pod="openshift-infra/auto-csr-approver-29533960-g7p4h" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.385105 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xt75\" (UniqueName: \"kubernetes.io/projected/b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b-kube-api-access-6xt75\") pod \"auto-csr-approver-29533960-g7p4h\" (UID: \"b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b\") " pod="openshift-infra/auto-csr-approver-29533960-g7p4h" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.403669 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xt75\" (UniqueName: \"kubernetes.io/projected/b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b-kube-api-access-6xt75\") pod \"auto-csr-approver-29533960-g7p4h\" (UID: \"b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b\") " pod="openshift-infra/auto-csr-approver-29533960-g7p4h" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.474526 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533960-g7p4h" Feb 25 16:40:00 crc kubenswrapper[4937]: I0225 16:40:00.935180 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533960-g7p4h"] Feb 25 16:40:01 crc kubenswrapper[4937]: I0225 16:40:01.058686 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533960-g7p4h" event={"ID":"b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b","Type":"ContainerStarted","Data":"9d581ac66f68ed0b01002e774a3c23667c9d660a82490db3a1c3e3339c4c999b"} Feb 25 16:40:03 crc kubenswrapper[4937]: I0225 16:40:03.093011 4937 generic.go:334] "Generic (PLEG): container finished" podID="b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b" containerID="2680c959e1b83254b0c4e7a20d96a327e2affa1f6a82c163ec8e25814331b94a" exitCode=0 Feb 25 16:40:03 crc kubenswrapper[4937]: I0225 16:40:03.093090 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533960-g7p4h" event={"ID":"b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b","Type":"ContainerDied","Data":"2680c959e1b83254b0c4e7a20d96a327e2affa1f6a82c163ec8e25814331b94a"} Feb 25 16:40:04 crc kubenswrapper[4937]: I0225 16:40:04.612852 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533960-g7p4h" Feb 25 16:40:04 crc kubenswrapper[4937]: I0225 16:40:04.676791 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xt75\" (UniqueName: \"kubernetes.io/projected/b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b-kube-api-access-6xt75\") pod \"b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b\" (UID: \"b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b\") " Feb 25 16:40:04 crc kubenswrapper[4937]: I0225 16:40:04.713957 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b-kube-api-access-6xt75" (OuterVolumeSpecName: "kube-api-access-6xt75") pod "b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b" (UID: "b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b"). InnerVolumeSpecName "kube-api-access-6xt75". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:40:04 crc kubenswrapper[4937]: I0225 16:40:04.779475 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xt75\" (UniqueName: \"kubernetes.io/projected/b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b-kube-api-access-6xt75\") on node \"crc\" DevicePath \"\"" Feb 25 16:40:05 crc kubenswrapper[4937]: I0225 16:40:05.117207 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533960-g7p4h" event={"ID":"b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b","Type":"ContainerDied","Data":"9d581ac66f68ed0b01002e774a3c23667c9d660a82490db3a1c3e3339c4c999b"} Feb 25 16:40:05 crc kubenswrapper[4937]: I0225 16:40:05.117249 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d581ac66f68ed0b01002e774a3c23667c9d660a82490db3a1c3e3339c4c999b" Feb 25 16:40:05 crc kubenswrapper[4937]: I0225 16:40:05.117314 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533960-g7p4h" Feb 25 16:40:05 crc kubenswrapper[4937]: I0225 16:40:05.682638 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533954-x95fl"] Feb 25 16:40:05 crc kubenswrapper[4937]: I0225 16:40:05.697211 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533954-x95fl"] Feb 25 16:40:06 crc kubenswrapper[4937]: I0225 16:40:06.367331 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:40:06 crc kubenswrapper[4937]: E0225 16:40:06.367627 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:40:07 crc kubenswrapper[4937]: I0225 16:40:07.379929 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6fae4fd-13ae-41d5-820a-90e4b0ebaae5" path="/var/lib/kubelet/pods/a6fae4fd-13ae-41d5-820a-90e4b0ebaae5/volumes" Feb 25 16:40:18 crc kubenswrapper[4937]: I0225 16:40:18.367271 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:40:18 crc kubenswrapper[4937]: E0225 16:40:18.368153 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:40:32 crc kubenswrapper[4937]: I0225 16:40:32.367521 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:40:32 crc kubenswrapper[4937]: E0225 16:40:32.368338 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:40:42 crc kubenswrapper[4937]: I0225 16:40:42.331715 4937 scope.go:117] "RemoveContainer" containerID="3d6b4ab9385d09ae3e498c8979fbd74da96b60bb98e0e2b7f49047e24f5e6225" Feb 25 16:40:43 crc kubenswrapper[4937]: I0225 16:40:43.367278 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:40:43 crc kubenswrapper[4937]: E0225 16:40:43.368364 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 
16:40:56 crc kubenswrapper[4937]: I0225 16:40:56.368469 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:40:56 crc kubenswrapper[4937]: E0225 16:40:56.371131 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:41:11 crc kubenswrapper[4937]: I0225 16:41:11.382183 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:41:11 crc kubenswrapper[4937]: E0225 16:41:11.384026 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:41:22 crc kubenswrapper[4937]: I0225 16:41:22.368069 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:41:22 crc kubenswrapper[4937]: E0225 16:41:22.368899 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:41:34 crc kubenswrapper[4937]: I0225 16:41:34.368606 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:41:34 crc kubenswrapper[4937]: E0225 16:41:34.369356 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:41:45 crc kubenswrapper[4937]: I0225 16:41:45.367913 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:41:45 crc kubenswrapper[4937]: E0225 16:41:45.368527 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:41:58 crc kubenswrapper[4937]: I0225 16:41:58.368649 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:41:58 crc 
kubenswrapper[4937]: E0225 16:41:58.371310 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:42:00 crc kubenswrapper[4937]: I0225 16:42:00.196603 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533962-szqn9"] Feb 25 16:42:00 crc kubenswrapper[4937]: E0225 16:42:00.197634 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b" containerName="oc" Feb 25 16:42:00 crc kubenswrapper[4937]: I0225 16:42:00.197652 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b" containerName="oc" Feb 25 16:42:00 crc kubenswrapper[4937]: I0225 16:42:00.197945 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b" containerName="oc" Feb 25 16:42:00 crc kubenswrapper[4937]: I0225 16:42:00.198937 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533962-szqn9" Feb 25 16:42:00 crc kubenswrapper[4937]: I0225 16:42:00.201129 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:42:00 crc kubenswrapper[4937]: I0225 16:42:00.205216 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:42:00 crc kubenswrapper[4937]: I0225 16:42:00.205253 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:42:00 crc kubenswrapper[4937]: I0225 16:42:00.209818 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533962-szqn9"] Feb 25 16:42:00 crc kubenswrapper[4937]: I0225 16:42:00.283897 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj9kq\" (UniqueName: \"kubernetes.io/projected/90e245e2-fcd1-419a-b30f-e248e047cae5-kube-api-access-kj9kq\") pod \"auto-csr-approver-29533962-szqn9\" (UID: \"90e245e2-fcd1-419a-b30f-e248e047cae5\") " pod="openshift-infra/auto-csr-approver-29533962-szqn9" Feb 25 16:42:00 crc kubenswrapper[4937]: I0225 16:42:00.385520 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj9kq\" (UniqueName: \"kubernetes.io/projected/90e245e2-fcd1-419a-b30f-e248e047cae5-kube-api-access-kj9kq\") pod \"auto-csr-approver-29533962-szqn9\" (UID: \"90e245e2-fcd1-419a-b30f-e248e047cae5\") " pod="openshift-infra/auto-csr-approver-29533962-szqn9" Feb 25 16:42:00 crc kubenswrapper[4937]: I0225 16:42:00.408335 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj9kq\" (UniqueName: \"kubernetes.io/projected/90e245e2-fcd1-419a-b30f-e248e047cae5-kube-api-access-kj9kq\") pod \"auto-csr-approver-29533962-szqn9\" (UID: \"90e245e2-fcd1-419a-b30f-e248e047cae5\") " pod="openshift-infra/auto-csr-approver-29533962-szqn9" Feb 25 16:42:00 crc kubenswrapper[4937]: I0225 16:42:00.521294 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533962-szqn9" Feb 25 16:42:01 crc kubenswrapper[4937]: I0225 16:42:01.013354 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533962-szqn9"] Feb 25 16:42:01 crc kubenswrapper[4937]: I0225 16:42:01.132015 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533962-szqn9" event={"ID":"90e245e2-fcd1-419a-b30f-e248e047cae5","Type":"ContainerStarted","Data":"955f5dd47e65878a7b42cf88997cb30ba21a194b9fa1929d07001bfed2dfee81"} Feb 25 16:42:04 crc kubenswrapper[4937]: I0225 16:42:04.167151 4937 generic.go:334] "Generic (PLEG): container finished" podID="90e245e2-fcd1-419a-b30f-e248e047cae5" containerID="900bd544d6d636b7f5f1e4be8069ace7ca5db0366b511643af6129cb65a82a63" exitCode=0 Feb 25 16:42:04 crc kubenswrapper[4937]: I0225 16:42:04.167279 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533962-szqn9" event={"ID":"90e245e2-fcd1-419a-b30f-e248e047cae5","Type":"ContainerDied","Data":"900bd544d6d636b7f5f1e4be8069ace7ca5db0366b511643af6129cb65a82a63"} Feb 25 16:42:05 crc kubenswrapper[4937]: I0225 16:42:05.882153 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533962-szqn9" Feb 25 16:42:06 crc kubenswrapper[4937]: I0225 16:42:06.035299 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj9kq\" (UniqueName: \"kubernetes.io/projected/90e245e2-fcd1-419a-b30f-e248e047cae5-kube-api-access-kj9kq\") pod \"90e245e2-fcd1-419a-b30f-e248e047cae5\" (UID: \"90e245e2-fcd1-419a-b30f-e248e047cae5\") " Feb 25 16:42:06 crc kubenswrapper[4937]: I0225 16:42:06.049351 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e245e2-fcd1-419a-b30f-e248e047cae5-kube-api-access-kj9kq" (OuterVolumeSpecName: "kube-api-access-kj9kq") pod "90e245e2-fcd1-419a-b30f-e248e047cae5" (UID: "90e245e2-fcd1-419a-b30f-e248e047cae5"). InnerVolumeSpecName "kube-api-access-kj9kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:42:06 crc kubenswrapper[4937]: I0225 16:42:06.141275 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj9kq\" (UniqueName: \"kubernetes.io/projected/90e245e2-fcd1-419a-b30f-e248e047cae5-kube-api-access-kj9kq\") on node \"crc\" DevicePath \"\"" Feb 25 16:42:06 crc kubenswrapper[4937]: I0225 16:42:06.190918 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533962-szqn9" event={"ID":"90e245e2-fcd1-419a-b30f-e248e047cae5","Type":"ContainerDied","Data":"955f5dd47e65878a7b42cf88997cb30ba21a194b9fa1929d07001bfed2dfee81"} Feb 25 16:42:06 crc kubenswrapper[4937]: I0225 16:42:06.190974 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="955f5dd47e65878a7b42cf88997cb30ba21a194b9fa1929d07001bfed2dfee81" Feb 25 16:42:06 crc kubenswrapper[4937]: I0225 16:42:06.190978 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533962-szqn9" Feb 25 16:42:06 crc kubenswrapper[4937]: I0225 16:42:06.964875 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533956-wgbtn"] Feb 25 16:42:06 crc kubenswrapper[4937]: I0225 16:42:06.977540 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533956-wgbtn"] Feb 25 16:42:07 crc kubenswrapper[4937]: I0225 16:42:07.379310 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43c48c19-789b-4324-9f62-e7d421eb8595" path="/var/lib/kubelet/pods/43c48c19-789b-4324-9f62-e7d421eb8595/volumes" Feb 25 16:42:09 crc kubenswrapper[4937]: I0225 16:42:09.367982 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:42:09 crc kubenswrapper[4937]: E0225 16:42:09.368571 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:42:24 crc kubenswrapper[4937]: I0225 16:42:24.374953 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:42:25 crc kubenswrapper[4937]: I0225 16:42:25.431924 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"c523637845c15bc142c970843436a3634d3a2d1727208c0649d730f119f41f73"} Feb 25 16:42:42 crc kubenswrapper[4937]: I0225 16:42:42.845561 4937 scope.go:117] "RemoveContainer" containerID="a28f6ebff7ba476bb4bd8474e60db715854742225d3aadaba9533a9f9ad4835e" Feb 25 16:43:37 crc kubenswrapper[4937]: I0225 16:43:37.148439 4937 generic.go:334] "Generic (PLEG): container finished" podID="994bcbfb-8270-42b1-bc77-6a262f2d29e3" containerID="7de2c2135b975ec78bef7ded176cdec997b2237395b1a411360dcc097e1f0896" exitCode=0 Feb 25 16:43:37 crc kubenswrapper[4937]: I0225 16:43:37.148575 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"994bcbfb-8270-42b1-bc77-6a262f2d29e3","Type":"ContainerDied","Data":"7de2c2135b975ec78bef7ded176cdec997b2237395b1a411360dcc097e1f0896"} Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.782692 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.949245 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xht6\" (UniqueName: \"kubernetes.io/projected/994bcbfb-8270-42b1-bc77-6a262f2d29e3-kube-api-access-4xht6\") pod \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.949360 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/994bcbfb-8270-42b1-bc77-6a262f2d29e3-config-data\") pod \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.949423 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-openstack-config-secret\") pod \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.949449 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-ssh-key\") pod \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.949556 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/994bcbfb-8270-42b1-bc77-6a262f2d29e3-openstack-config\") pod \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.949589 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/994bcbfb-8270-42b1-bc77-6a262f2d29e3-test-operator-ephemeral-workdir\") pod \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.949638 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-ca-certs\") pod \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.949737 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/994bcbfb-8270-42b1-bc77-6a262f2d29e3-test-operator-ephemeral-temporary\") pod \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.949776 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\" (UID: \"994bcbfb-8270-42b1-bc77-6a262f2d29e3\") " Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.950226 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/994bcbfb-8270-42b1-bc77-6a262f2d29e3-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "994bcbfb-8270-42b1-bc77-6a262f2d29e3" (UID: "994bcbfb-8270-42b1-bc77-6a262f2d29e3"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.951234 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994bcbfb-8270-42b1-bc77-6a262f2d29e3-config-data" (OuterVolumeSpecName: "config-data") pod "994bcbfb-8270-42b1-bc77-6a262f2d29e3" (UID: "994bcbfb-8270-42b1-bc77-6a262f2d29e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.951240 4937 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/994bcbfb-8270-42b1-bc77-6a262f2d29e3-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.955431 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "994bcbfb-8270-42b1-bc77-6a262f2d29e3" (UID: "994bcbfb-8270-42b1-bc77-6a262f2d29e3"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.955977 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/994bcbfb-8270-42b1-bc77-6a262f2d29e3-kube-api-access-4xht6" (OuterVolumeSpecName: "kube-api-access-4xht6") pod "994bcbfb-8270-42b1-bc77-6a262f2d29e3" (UID: "994bcbfb-8270-42b1-bc77-6a262f2d29e3"). InnerVolumeSpecName "kube-api-access-4xht6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.978679 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "994bcbfb-8270-42b1-bc77-6a262f2d29e3" (UID: "994bcbfb-8270-42b1-bc77-6a262f2d29e3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.979263 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "994bcbfb-8270-42b1-bc77-6a262f2d29e3" (UID: "994bcbfb-8270-42b1-bc77-6a262f2d29e3"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:43:38 crc kubenswrapper[4937]: I0225 16:43:38.991027 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "994bcbfb-8270-42b1-bc77-6a262f2d29e3" (UID: "994bcbfb-8270-42b1-bc77-6a262f2d29e3"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.014741 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994bcbfb-8270-42b1-bc77-6a262f2d29e3-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "994bcbfb-8270-42b1-bc77-6a262f2d29e3" (UID: "994bcbfb-8270-42b1-bc77-6a262f2d29e3"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.053295 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/994bcbfb-8270-42b1-bc77-6a262f2d29e3-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.053333 4937 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.053344 4937 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.053354 4937 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/994bcbfb-8270-42b1-bc77-6a262f2d29e3-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.053366 4937 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/994bcbfb-8270-42b1-bc77-6a262f2d29e3-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.053394 4937 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.053406 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xht6\" (UniqueName: \"kubernetes.io/projected/994bcbfb-8270-42b1-bc77-6a262f2d29e3-kube-api-access-4xht6\") on node \"crc\" DevicePath \"\"" Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.075351 4937 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.160176 4937 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.172265 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"994bcbfb-8270-42b1-bc77-6a262f2d29e3","Type":"ContainerDied","Data":"1a8387000143fa1af1b551b73d2f1c232d4c67c2ac9fe5487ac633d74f453730"} Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.172310 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a8387000143fa1af1b551b73d2f1c232d4c67c2ac9fe5487ac633d74f453730" Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.172580 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.336786 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/994bcbfb-8270-42b1-bc77-6a262f2d29e3-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "994bcbfb-8270-42b1-bc77-6a262f2d29e3" (UID: "994bcbfb-8270-42b1-bc77-6a262f2d29e3"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:43:39 crc kubenswrapper[4937]: I0225 16:43:39.368228 4937 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/994bcbfb-8270-42b1-bc77-6a262f2d29e3-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.614954 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 25 16:43:41 crc kubenswrapper[4937]: E0225 16:43:41.615865 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e245e2-fcd1-419a-b30f-e248e047cae5" containerName="oc" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.615887 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e245e2-fcd1-419a-b30f-e248e047cae5" containerName="oc" Feb 25 16:43:41 crc kubenswrapper[4937]: E0225 16:43:41.615921 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994bcbfb-8270-42b1-bc77-6a262f2d29e3" containerName="tempest-tests-tempest-tests-runner" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.615930 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="994bcbfb-8270-42b1-bc77-6a262f2d29e3" containerName="tempest-tests-tempest-tests-runner" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.616185 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="994bcbfb-8270-42b1-bc77-6a262f2d29e3" containerName="tempest-tests-tempest-tests-runner" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.616211 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e245e2-fcd1-419a-b30f-e248e047cae5" containerName="oc" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.617136 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.622110 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8pfdg" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.650061 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.816690 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdvlw\" (UniqueName: \"kubernetes.io/projected/f62befdf-83b6-4767-8de5-d552bb54e3f9-kube-api-access-mdvlw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f62befdf-83b6-4767-8de5-d552bb54e3f9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.816856 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f62befdf-83b6-4767-8de5-d552bb54e3f9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.918444 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdvlw\" (UniqueName: \"kubernetes.io/projected/f62befdf-83b6-4767-8de5-d552bb54e3f9-kube-api-access-mdvlw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f62befdf-83b6-4767-8de5-d552bb54e3f9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.918676 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f62befdf-83b6-4767-8de5-d552bb54e3f9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.919111 4937 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f62befdf-83b6-4767-8de5-d552bb54e3f9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.937665 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdvlw\" (UniqueName: \"kubernetes.io/projected/f62befdf-83b6-4767-8de5-d552bb54e3f9-kube-api-access-mdvlw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f62befdf-83b6-4767-8de5-d552bb54e3f9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 16:43:41 crc kubenswrapper[4937]: I0225 16:43:41.951731 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f62befdf-83b6-4767-8de5-d552bb54e3f9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 16:43:42 crc 
kubenswrapper[4937]: I0225 16:43:42.249836 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 16:43:43 crc kubenswrapper[4937]: I0225 16:43:43.332270 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 25 16:43:44 crc kubenswrapper[4937]: I0225 16:43:44.312255 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"f62befdf-83b6-4767-8de5-d552bb54e3f9","Type":"ContainerStarted","Data":"488e9979cd8e94433f8e33149a4b264527a317b517523d363ee03eff0b5a5e43"} Feb 25 16:43:45 crc kubenswrapper[4937]: I0225 16:43:45.324327 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"f62befdf-83b6-4767-8de5-d552bb54e3f9","Type":"ContainerStarted","Data":"1adaa2b7c3a7f2f16da3f6196b8b7cf8deea7d9773a39de392c0f39c42b7842d"} Feb 25 16:43:45 crc kubenswrapper[4937]: I0225 16:43:45.338415 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.53171236 podStartE2EDuration="4.338395935s" podCreationTimestamp="2026-02-25 16:43:41 +0000 UTC" firstStartedPulling="2026-02-25 16:43:43.347402197 +0000 UTC m=+3474.360794087" lastFinishedPulling="2026-02-25 16:43:44.154085772 +0000 UTC m=+3475.167477662" observedRunningTime="2026-02-25 16:43:45.33780998 +0000 UTC m=+3476.351201870" watchObservedRunningTime="2026-02-25 16:43:45.338395935 +0000 UTC m=+3476.351787835" Feb 25 16:44:00 crc kubenswrapper[4937]: I0225 16:44:00.144766 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533964-rrcl8"] Feb 25 16:44:00 crc kubenswrapper[4937]: I0225 16:44:00.146944 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533964-rrcl8" Feb 25 16:44:00 crc kubenswrapper[4937]: I0225 16:44:00.149578 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:44:00 crc kubenswrapper[4937]: I0225 16:44:00.150095 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:44:00 crc kubenswrapper[4937]: I0225 16:44:00.150723 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:44:00 crc kubenswrapper[4937]: I0225 16:44:00.158584 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533964-rrcl8"] Feb 25 16:44:00 crc kubenswrapper[4937]: I0225 16:44:00.244035 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-964zz\" (UniqueName: \"kubernetes.io/projected/261e5bc8-bfe5-4036-8599-19cd5c173b63-kube-api-access-964zz\") pod \"auto-csr-approver-29533964-rrcl8\" (UID: \"261e5bc8-bfe5-4036-8599-19cd5c173b63\") " pod="openshift-infra/auto-csr-approver-29533964-rrcl8" Feb 25 16:44:00 crc kubenswrapper[4937]: I0225 16:44:00.346557 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-964zz\" (UniqueName: \"kubernetes.io/projected/261e5bc8-bfe5-4036-8599-19cd5c173b63-kube-api-access-964zz\") pod \"auto-csr-approver-29533964-rrcl8\" (UID: \"261e5bc8-bfe5-4036-8599-19cd5c173b63\") " pod="openshift-infra/auto-csr-approver-29533964-rrcl8" Feb 25 16:44:00 crc kubenswrapper[4937]: I0225 16:44:00.364686 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-964zz\" (UniqueName: \"kubernetes.io/projected/261e5bc8-bfe5-4036-8599-19cd5c173b63-kube-api-access-964zz\") pod \"auto-csr-approver-29533964-rrcl8\" (UID: \"261e5bc8-bfe5-4036-8599-19cd5c173b63\") " pod="openshift-infra/auto-csr-approver-29533964-rrcl8" Feb 25 16:44:00 crc kubenswrapper[4937]: I0225 16:44:00.465528 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533964-rrcl8" Feb 25 16:44:00 crc kubenswrapper[4937]: I0225 16:44:00.913230 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533964-rrcl8"] Feb 25 16:44:01 crc kubenswrapper[4937]: I0225 16:44:01.515689 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533964-rrcl8" event={"ID":"261e5bc8-bfe5-4036-8599-19cd5c173b63","Type":"ContainerStarted","Data":"6a8b0814a799dfb3b3bf45dfa17c361c702f3778076b0c8be4ebeaaafe3afddb"} Feb 25 16:44:02 crc kubenswrapper[4937]: I0225 16:44:02.535038 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533964-rrcl8" event={"ID":"261e5bc8-bfe5-4036-8599-19cd5c173b63","Type":"ContainerStarted","Data":"5355803bb7fbcf681f764ce9972c98fb75aeef53730dbeb246a47805a76ab3b3"} Feb 25 16:44:02 crc kubenswrapper[4937]: I0225 16:44:02.562540 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533964-rrcl8" podStartSLOduration=1.32800578 podStartE2EDuration="2.56250796s" podCreationTimestamp="2026-02-25 16:44:00 +0000 UTC" firstStartedPulling="2026-02-25 16:44:00.911014746 +0000 UTC m=+3491.924406636" lastFinishedPulling="2026-02-25 16:44:02.145516926 +0000 UTC m=+3493.158908816" observedRunningTime="2026-02-25 16:44:02.554713495 +0000 UTC m=+3493.568105395" watchObservedRunningTime="2026-02-25 16:44:02.56250796 +0000 UTC m=+3493.575899850" Feb 25 16:44:03 crc kubenswrapper[4937]: I0225 16:44:03.547813 4937 generic.go:334] "Generic (PLEG): container finished" podID="261e5bc8-bfe5-4036-8599-19cd5c173b63" containerID="5355803bb7fbcf681f764ce9972c98fb75aeef53730dbeb246a47805a76ab3b3" exitCode=0 Feb 25 16:44:03 crc kubenswrapper[4937]: I0225 16:44:03.547908 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533964-rrcl8" event={"ID":"261e5bc8-bfe5-4036-8599-19cd5c173b63","Type":"ContainerDied","Data":"5355803bb7fbcf681f764ce9972c98fb75aeef53730dbeb246a47805a76ab3b3"} Feb 25 16:44:05 crc kubenswrapper[4937]: I0225 16:44:05.231431 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533964-rrcl8" Feb 25 16:44:05 crc kubenswrapper[4937]: I0225 16:44:05.355581 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-964zz\" (UniqueName: \"kubernetes.io/projected/261e5bc8-bfe5-4036-8599-19cd5c173b63-kube-api-access-964zz\") pod \"261e5bc8-bfe5-4036-8599-19cd5c173b63\" (UID: \"261e5bc8-bfe5-4036-8599-19cd5c173b63\") " Feb 25 16:44:05 crc kubenswrapper[4937]: I0225 16:44:05.362636 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/261e5bc8-bfe5-4036-8599-19cd5c173b63-kube-api-access-964zz" (OuterVolumeSpecName: "kube-api-access-964zz") pod "261e5bc8-bfe5-4036-8599-19cd5c173b63" (UID: "261e5bc8-bfe5-4036-8599-19cd5c173b63"). InnerVolumeSpecName "kube-api-access-964zz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:44:05 crc kubenswrapper[4937]: I0225 16:44:05.458403 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-964zz\" (UniqueName: \"kubernetes.io/projected/261e5bc8-bfe5-4036-8599-19cd5c173b63-kube-api-access-964zz\") on node \"crc\" DevicePath \"\"" Feb 25 16:44:05 crc kubenswrapper[4937]: I0225 16:44:05.572066 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533964-rrcl8" event={"ID":"261e5bc8-bfe5-4036-8599-19cd5c173b63","Type":"ContainerDied","Data":"6a8b0814a799dfb3b3bf45dfa17c361c702f3778076b0c8be4ebeaaafe3afddb"} Feb 25 16:44:05 crc kubenswrapper[4937]: I0225 16:44:05.572111 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a8b0814a799dfb3b3bf45dfa17c361c702f3778076b0c8be4ebeaaafe3afddb" Feb 25 16:44:05 crc kubenswrapper[4937]: I0225 16:44:05.572126 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533964-rrcl8" Feb 25 16:44:05 crc kubenswrapper[4937]: I0225 16:44:05.621620 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533958-rfpf9"] Feb 25 16:44:05 crc kubenswrapper[4937]: I0225 16:44:05.630232 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533958-rfpf9"] Feb 25 16:44:07 crc kubenswrapper[4937]: I0225 16:44:07.379138 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dc53569-34e0-4131-9a94-5f40d0f4f247" path="/var/lib/kubelet/pods/5dc53569-34e0-4131-9a94-5f40d0f4f247/volumes" Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.397567 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jvrhp/must-gather-n2crw"] Feb 25 16:44:08 crc kubenswrapper[4937]: E0225 16:44:08.398448 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261e5bc8-bfe5-4036-8599-19cd5c173b63" containerName="oc" Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.398463 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="261e5bc8-bfe5-4036-8599-19cd5c173b63" containerName="oc" Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.398777 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="261e5bc8-bfe5-4036-8599-19cd5c173b63" containerName="oc" Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.400302 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jvrhp/must-gather-n2crw" Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.402857 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jvrhp"/"openshift-service-ca.crt" Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.402980 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jvrhp"/"kube-root-ca.crt" Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.409840 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jvrhp/must-gather-n2crw"] Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.516126 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e-must-gather-output\") pod \"must-gather-n2crw\" (UID: \"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e\") " pod="openshift-must-gather-jvrhp/must-gather-n2crw" Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.516162 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnfq9\" (UniqueName: \"kubernetes.io/projected/4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e-kube-api-access-tnfq9\") pod \"must-gather-n2crw\" (UID: \"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e\") " pod="openshift-must-gather-jvrhp/must-gather-n2crw" Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.618043 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e-must-gather-output\") pod \"must-gather-n2crw\" (UID: \"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e\") " pod="openshift-must-gather-jvrhp/must-gather-n2crw" Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.618363 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnfq9\" (UniqueName: \"kubernetes.io/projected/4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e-kube-api-access-tnfq9\") pod \"must-gather-n2crw\" (UID: \"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e\") " pod="openshift-must-gather-jvrhp/must-gather-n2crw" Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.618536 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e-must-gather-output\") pod \"must-gather-n2crw\" (UID: \"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e\") " pod="openshift-must-gather-jvrhp/must-gather-n2crw" Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.642500 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnfq9\" (UniqueName: \"kubernetes.io/projected/4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e-kube-api-access-tnfq9\") pod \"must-gather-n2crw\" (UID: \"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e\") " pod="openshift-must-gather-jvrhp/must-gather-n2crw" Feb 25 16:44:08 crc kubenswrapper[4937]: I0225 16:44:08.751038 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jvrhp/must-gather-n2crw" Feb 25 16:44:09 crc kubenswrapper[4937]: I0225 16:44:09.267208 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jvrhp/must-gather-n2crw"] Feb 25 16:44:09 crc kubenswrapper[4937]: I0225 16:44:09.271431 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 16:44:09 crc kubenswrapper[4937]: I0225 16:44:09.611278 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jvrhp/must-gather-n2crw" event={"ID":"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e","Type":"ContainerStarted","Data":"6160001d2a2d08067180af9d904c045fa8f9270253db29b206e9714ed004cb72"} Feb 25 16:44:18 crc kubenswrapper[4937]: I0225 16:44:18.734310 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jvrhp/must-gather-n2crw" event={"ID":"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e","Type":"ContainerStarted","Data":"109a3de283c41cbb01b1aaadfaecfaa86d3ad9c4f7e00f1660b74dedaa323e5c"} Feb 25 16:44:19 crc kubenswrapper[4937]: I0225 16:44:19.746650 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jvrhp/must-gather-n2crw" event={"ID":"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e","Type":"ContainerStarted","Data":"f59105570b7f271eb4f73dba9a935431309fb61003e7767267e2f7e144dd5b1f"} Feb 25 16:44:19 crc kubenswrapper[4937]: I0225 16:44:19.766234 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jvrhp/must-gather-n2crw" podStartSLOduration=2.756787797 podStartE2EDuration="11.766215203s" podCreationTimestamp="2026-02-25 16:44:08 +0000 UTC" firstStartedPulling="2026-02-25 16:44:09.271342873 +0000 UTC m=+3500.284734763" lastFinishedPulling="2026-02-25 16:44:18.280770279 +0000 UTC m=+3509.294162169" observedRunningTime="2026-02-25 16:44:19.763611338 +0000 UTC m=+3510.777003248" watchObservedRunningTime="2026-02-25 16:44:19.766215203 +0000 UTC m=+3510.779607093" Feb 25 16:44:22 crc kubenswrapper[4937]: I0225 16:44:22.769136 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jvrhp/crc-debug-c5vq5"] Feb 25 16:44:22 crc kubenswrapper[4937]: I0225 16:44:22.772143 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" Feb 25 16:44:22 crc kubenswrapper[4937]: I0225 16:44:22.774448 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jvrhp"/"default-dockercfg-96pms" Feb 25 16:44:22 crc kubenswrapper[4937]: I0225 16:44:22.935568 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f27af1e-6a33-46f8-bca7-cae305e3e1e3-host\") pod \"crc-debug-c5vq5\" (UID: \"4f27af1e-6a33-46f8-bca7-cae305e3e1e3\") " pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" Feb 25 16:44:22 crc kubenswrapper[4937]: I0225 16:44:22.935707 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mmfm\" (UniqueName: \"kubernetes.io/projected/4f27af1e-6a33-46f8-bca7-cae305e3e1e3-kube-api-access-6mmfm\") pod \"crc-debug-c5vq5\" (UID: \"4f27af1e-6a33-46f8-bca7-cae305e3e1e3\") " pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" Feb 25 16:44:23 crc kubenswrapper[4937]: I0225 16:44:23.038947 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f27af1e-6a33-46f8-bca7-cae305e3e1e3-host\") pod \"crc-debug-c5vq5\" (UID: \"4f27af1e-6a33-46f8-bca7-cae305e3e1e3\") " pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" Feb 25 16:44:23 crc kubenswrapper[4937]: I0225 16:44:23.039621 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mmfm\" (UniqueName: \"kubernetes.io/projected/4f27af1e-6a33-46f8-bca7-cae305e3e1e3-kube-api-access-6mmfm\") pod \"crc-debug-c5vq5\" (UID: \"4f27af1e-6a33-46f8-bca7-cae305e3e1e3\") " pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" Feb 25 16:44:23 crc kubenswrapper[4937]: I0225 16:44:23.039134 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f27af1e-6a33-46f8-bca7-cae305e3e1e3-host\") pod \"crc-debug-c5vq5\" (UID: \"4f27af1e-6a33-46f8-bca7-cae305e3e1e3\") " pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" Feb 25 16:44:23 crc kubenswrapper[4937]: I0225 16:44:23.064821 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mmfm\" (UniqueName: \"kubernetes.io/projected/4f27af1e-6a33-46f8-bca7-cae305e3e1e3-kube-api-access-6mmfm\") pod \"crc-debug-c5vq5\" (UID: \"4f27af1e-6a33-46f8-bca7-cae305e3e1e3\") " pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" Feb 25 16:44:23 crc kubenswrapper[4937]: I0225 16:44:23.100668 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" Feb 25 16:44:23 crc kubenswrapper[4937]: W0225 16:44:23.170429 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f27af1e_6a33_46f8_bca7_cae305e3e1e3.slice/crio-044812c809392865c4d4b3bceedbc07c64a791119e7da8ecb5878f21b8e8ca34 WatchSource:0}: Error finding container 044812c809392865c4d4b3bceedbc07c64a791119e7da8ecb5878f21b8e8ca34: Status 404 returned error can't find the container with id 044812c809392865c4d4b3bceedbc07c64a791119e7da8ecb5878f21b8e8ca34 Feb 25 16:44:23 crc kubenswrapper[4937]: I0225 16:44:23.795323 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" event={"ID":"4f27af1e-6a33-46f8-bca7-cae305e3e1e3","Type":"ContainerStarted","Data":"044812c809392865c4d4b3bceedbc07c64a791119e7da8ecb5878f21b8e8ca34"} Feb 25 16:44:37 crc kubenswrapper[4937]: I0225 16:44:37.943171 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" event={"ID":"4f27af1e-6a33-46f8-bca7-cae305e3e1e3","Type":"ContainerStarted","Data":"bb2e9d2b9c738a579061b25e47064c6a1fd4da7c5e0c1ab06634ed310ab18048"} Feb 25 16:44:37 crc kubenswrapper[4937]: I0225 16:44:37.968535 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" podStartSLOduration=2.094859431 podStartE2EDuration="15.968515739s" podCreationTimestamp="2026-02-25 16:44:22 +0000 UTC" firstStartedPulling="2026-02-25 16:44:23.173311819 +0000 UTC m=+3514.186703709" lastFinishedPulling="2026-02-25 16:44:37.046968127 +0000 UTC m=+3528.060360017" observedRunningTime="2026-02-25 16:44:37.964128629 +0000 UTC m=+3528.977520529" watchObservedRunningTime="2026-02-25 16:44:37.968515739 +0000 UTC m=+3528.981907629" Feb 25 16:44:41 crc kubenswrapper[4937]: I0225 16:44:41.494376 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:44:41 crc kubenswrapper[4937]: I0225 16:44:41.494989 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:44:43 crc kubenswrapper[4937]: I0225 16:44:43.300002 4937 scope.go:117] "RemoveContainer" containerID="c165af6627b531334b08b189d2ece7ac3f9f7afd546b1660022f3f11a71e108e" Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.161604 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9"] Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.163704 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.173617 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.173686 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.175407 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9"] Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.200274 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfvjt\" (UniqueName: \"kubernetes.io/projected/f4890cd3-0e53-48ad-b528-8320f44c6a83-kube-api-access-wfvjt\") pod \"collect-profiles-29533965-sstj9\" (UID: \"f4890cd3-0e53-48ad-b528-8320f44c6a83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.200644 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4890cd3-0e53-48ad-b528-8320f44c6a83-secret-volume\") pod \"collect-profiles-29533965-sstj9\" (UID: \"f4890cd3-0e53-48ad-b528-8320f44c6a83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.200672 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4890cd3-0e53-48ad-b528-8320f44c6a83-config-volume\") pod \"collect-profiles-29533965-sstj9\" (UID: \"f4890cd3-0e53-48ad-b528-8320f44c6a83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.302240 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfvjt\" (UniqueName: \"kubernetes.io/projected/f4890cd3-0e53-48ad-b528-8320f44c6a83-kube-api-access-wfvjt\") pod \"collect-profiles-29533965-sstj9\" (UID: \"f4890cd3-0e53-48ad-b528-8320f44c6a83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.302297 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4890cd3-0e53-48ad-b528-8320f44c6a83-secret-volume\") pod \"collect-profiles-29533965-sstj9\" (UID: \"f4890cd3-0e53-48ad-b528-8320f44c6a83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.302323 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4890cd3-0e53-48ad-b528-8320f44c6a83-config-volume\") pod \"collect-profiles-29533965-sstj9\" (UID: \"f4890cd3-0e53-48ad-b528-8320f44c6a83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.303171 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4890cd3-0e53-48ad-b528-8320f44c6a83-config-volume\") pod 
\"collect-profiles-29533965-sstj9\" (UID: \"f4890cd3-0e53-48ad-b528-8320f44c6a83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.313304 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4890cd3-0e53-48ad-b528-8320f44c6a83-secret-volume\") pod \"collect-profiles-29533965-sstj9\" (UID: \"f4890cd3-0e53-48ad-b528-8320f44c6a83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.318972 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfvjt\" (UniqueName: \"kubernetes.io/projected/f4890cd3-0e53-48ad-b528-8320f44c6a83-kube-api-access-wfvjt\") pod \"collect-profiles-29533965-sstj9\" (UID: \"f4890cd3-0e53-48ad-b528-8320f44c6a83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" Feb 25 16:45:00 crc kubenswrapper[4937]: I0225 16:45:00.547588 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" Feb 25 16:45:01 crc kubenswrapper[4937]: I0225 16:45:01.235455 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9"] Feb 25 16:45:02 crc kubenswrapper[4937]: I0225 16:45:02.171709 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" event={"ID":"f4890cd3-0e53-48ad-b528-8320f44c6a83","Type":"ContainerStarted","Data":"1b452e1de809a59f0a6182c9e2cde6cc8c39b36459c68d54a02bd839f22ded81"} Feb 25 16:45:02 crc kubenswrapper[4937]: I0225 16:45:02.172387 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" event={"ID":"f4890cd3-0e53-48ad-b528-8320f44c6a83","Type":"ContainerStarted","Data":"63138bd0183b0d68124221e24a11ee243d9a6ec51c0dae0303d4f31f17ae8f3f"} Feb 25 16:45:03 crc kubenswrapper[4937]: I0225 16:45:03.182693 4937 generic.go:334] "Generic (PLEG): container finished" podID="f4890cd3-0e53-48ad-b528-8320f44c6a83" containerID="1b452e1de809a59f0a6182c9e2cde6cc8c39b36459c68d54a02bd839f22ded81" exitCode=0 Feb 25 16:45:03 crc kubenswrapper[4937]: I0225 16:45:03.182744 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" event={"ID":"f4890cd3-0e53-48ad-b528-8320f44c6a83","Type":"ContainerDied","Data":"1b452e1de809a59f0a6182c9e2cde6cc8c39b36459c68d54a02bd839f22ded81"} Feb 25 16:45:03 crc kubenswrapper[4937]: I0225 16:45:03.881756 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" Feb 25 16:45:03 crc kubenswrapper[4937]: I0225 16:45:03.992198 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4890cd3-0e53-48ad-b528-8320f44c6a83-config-volume\") pod \"f4890cd3-0e53-48ad-b528-8320f44c6a83\" (UID: \"f4890cd3-0e53-48ad-b528-8320f44c6a83\") " Feb 25 16:45:03 crc kubenswrapper[4937]: I0225 16:45:03.992299 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfvjt\" (UniqueName: \"kubernetes.io/projected/f4890cd3-0e53-48ad-b528-8320f44c6a83-kube-api-access-wfvjt\") pod \"f4890cd3-0e53-48ad-b528-8320f44c6a83\" (UID: \"f4890cd3-0e53-48ad-b528-8320f44c6a83\") " Feb 25 16:45:03 crc kubenswrapper[4937]: I0225 16:45:03.992403 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4890cd3-0e53-48ad-b528-8320f44c6a83-secret-volume\") pod \"f4890cd3-0e53-48ad-b528-8320f44c6a83\" (UID: \"f4890cd3-0e53-48ad-b528-8320f44c6a83\") " Feb 25 16:45:03 crc kubenswrapper[4937]: I0225 16:45:03.993085 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4890cd3-0e53-48ad-b528-8320f44c6a83-config-volume" (OuterVolumeSpecName: "config-volume") pod "f4890cd3-0e53-48ad-b528-8320f44c6a83" (UID: "f4890cd3-0e53-48ad-b528-8320f44c6a83"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 16:45:04 crc kubenswrapper[4937]: I0225 16:45:04.001625 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4890cd3-0e53-48ad-b528-8320f44c6a83-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f4890cd3-0e53-48ad-b528-8320f44c6a83" (UID: "f4890cd3-0e53-48ad-b528-8320f44c6a83"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 16:45:04 crc kubenswrapper[4937]: I0225 16:45:04.013194 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4890cd3-0e53-48ad-b528-8320f44c6a83-kube-api-access-wfvjt" (OuterVolumeSpecName: "kube-api-access-wfvjt") pod "f4890cd3-0e53-48ad-b528-8320f44c6a83" (UID: "f4890cd3-0e53-48ad-b528-8320f44c6a83"). InnerVolumeSpecName "kube-api-access-wfvjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:45:04 crc kubenswrapper[4937]: I0225 16:45:04.094908 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfvjt\" (UniqueName: \"kubernetes.io/projected/f4890cd3-0e53-48ad-b528-8320f44c6a83-kube-api-access-wfvjt\") on node \"crc\" DevicePath \"\"" Feb 25 16:45:04 crc kubenswrapper[4937]: I0225 16:45:04.094942 4937 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4890cd3-0e53-48ad-b528-8320f44c6a83-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 16:45:04 crc kubenswrapper[4937]: I0225 16:45:04.094953 4937 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4890cd3-0e53-48ad-b528-8320f44c6a83-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 16:45:04 crc kubenswrapper[4937]: I0225 16:45:04.193446 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" event={"ID":"f4890cd3-0e53-48ad-b528-8320f44c6a83","Type":"ContainerDied","Data":"63138bd0183b0d68124221e24a11ee243d9a6ec51c0dae0303d4f31f17ae8f3f"} Feb 25 16:45:04 crc kubenswrapper[4937]: I0225 16:45:04.193503 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63138bd0183b0d68124221e24a11ee243d9a6ec51c0dae0303d4f31f17ae8f3f" Feb 25 16:45:04 crc kubenswrapper[4937]: I0225 16:45:04.193505 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533965-sstj9" Feb 25 16:45:04 crc kubenswrapper[4937]: I0225 16:45:04.956749 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw"] Feb 25 16:45:04 crc kubenswrapper[4937]: I0225 16:45:04.965722 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533920-ksbgw"] Feb 25 16:45:05 crc kubenswrapper[4937]: I0225 16:45:05.381537 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f" path="/var/lib/kubelet/pods/250e9ce0-aa13-4ba3-ab3f-b6fd566d7f9f/volumes" Feb 25 16:45:11 crc kubenswrapper[4937]: I0225 16:45:11.494296 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:45:11 crc kubenswrapper[4937]: I0225 16:45:11.494789 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:45:39 crc kubenswrapper[4937]: I0225 16:45:39.535954 4937 generic.go:334] "Generic (PLEG): container finished" podID="4f27af1e-6a33-46f8-bca7-cae305e3e1e3" containerID="bb2e9d2b9c738a579061b25e47064c6a1fd4da7c5e0c1ab06634ed310ab18048" exitCode=0 Feb 25 16:45:39 crc kubenswrapper[4937]: I0225 16:45:39.536025 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" 
event={"ID":"4f27af1e-6a33-46f8-bca7-cae305e3e1e3","Type":"ContainerDied","Data":"bb2e9d2b9c738a579061b25e47064c6a1fd4da7c5e0c1ab06634ed310ab18048"} Feb 25 16:45:40 crc kubenswrapper[4937]: I0225 16:45:40.639961 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" Feb 25 16:45:40 crc kubenswrapper[4937]: I0225 16:45:40.684106 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jvrhp/crc-debug-c5vq5"] Feb 25 16:45:40 crc kubenswrapper[4937]: I0225 16:45:40.694528 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jvrhp/crc-debug-c5vq5"] Feb 25 16:45:40 crc kubenswrapper[4937]: I0225 16:45:40.760759 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mmfm\" (UniqueName: \"kubernetes.io/projected/4f27af1e-6a33-46f8-bca7-cae305e3e1e3-kube-api-access-6mmfm\") pod \"4f27af1e-6a33-46f8-bca7-cae305e3e1e3\" (UID: \"4f27af1e-6a33-46f8-bca7-cae305e3e1e3\") " Feb 25 16:45:40 crc kubenswrapper[4937]: I0225 16:45:40.760867 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f27af1e-6a33-46f8-bca7-cae305e3e1e3-host\") pod \"4f27af1e-6a33-46f8-bca7-cae305e3e1e3\" (UID: \"4f27af1e-6a33-46f8-bca7-cae305e3e1e3\") " Feb 25 16:45:40 crc kubenswrapper[4937]: I0225 16:45:40.761003 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f27af1e-6a33-46f8-bca7-cae305e3e1e3-host" (OuterVolumeSpecName: "host") pod "4f27af1e-6a33-46f8-bca7-cae305e3e1e3" (UID: "4f27af1e-6a33-46f8-bca7-cae305e3e1e3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:45:40 crc kubenswrapper[4937]: I0225 16:45:40.761468 4937 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f27af1e-6a33-46f8-bca7-cae305e3e1e3-host\") on node \"crc\" DevicePath \"\"" Feb 25 16:45:40 crc kubenswrapper[4937]: I0225 16:45:40.766269 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f27af1e-6a33-46f8-bca7-cae305e3e1e3-kube-api-access-6mmfm" (OuterVolumeSpecName: "kube-api-access-6mmfm") pod "4f27af1e-6a33-46f8-bca7-cae305e3e1e3" (UID: "4f27af1e-6a33-46f8-bca7-cae305e3e1e3"). InnerVolumeSpecName "kube-api-access-6mmfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:45:40 crc kubenswrapper[4937]: I0225 16:45:40.862934 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mmfm\" (UniqueName: \"kubernetes.io/projected/4f27af1e-6a33-46f8-bca7-cae305e3e1e3-kube-api-access-6mmfm\") on node \"crc\" DevicePath \"\"" Feb 25 16:45:41 crc kubenswrapper[4937]: I0225 16:45:41.387137 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f27af1e-6a33-46f8-bca7-cae305e3e1e3" path="/var/lib/kubelet/pods/4f27af1e-6a33-46f8-bca7-cae305e3e1e3/volumes" Feb 25 16:45:41 crc kubenswrapper[4937]: I0225 16:45:41.502034 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:45:41 crc kubenswrapper[4937]: I0225 16:45:41.502434 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:45:41 crc kubenswrapper[4937]: I0225 16:45:41.502522 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 16:45:41 crc kubenswrapper[4937]: I0225 16:45:41.505683 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c523637845c15bc142c970843436a3634d3a2d1727208c0649d730f119f41f73"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 16:45:41 crc kubenswrapper[4937]: I0225 16:45:41.505751 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://c523637845c15bc142c970843436a3634d3a2d1727208c0649d730f119f41f73" gracePeriod=600 Feb 25 16:45:41 crc kubenswrapper[4937]: I0225 16:45:41.571784 4937 scope.go:117] "RemoveContainer" containerID="bb2e9d2b9c738a579061b25e47064c6a1fd4da7c5e0c1ab06634ed310ab18048" Feb 25 16:45:41 crc kubenswrapper[4937]: I0225 16:45:41.571955 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jvrhp/crc-debug-c5vq5" Feb 25 16:45:41 crc kubenswrapper[4937]: E0225 16:45:41.911019 4937 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f826096_fb93_42fe_a779_9afe1d36f2d4.slice/crio-conmon-c523637845c15bc142c970843436a3634d3a2d1727208c0649d730f119f41f73.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f826096_fb93_42fe_a779_9afe1d36f2d4.slice/crio-c523637845c15bc142c970843436a3634d3a2d1727208c0649d730f119f41f73.scope\": RecentStats: unable to find data in memory cache]" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.024112 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jvrhp/crc-debug-7ztp5"] Feb 25 16:45:42 crc kubenswrapper[4937]: E0225 16:45:42.024924 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4890cd3-0e53-48ad-b528-8320f44c6a83" containerName="collect-profiles" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.024944 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4890cd3-0e53-48ad-b528-8320f44c6a83" containerName="collect-profiles" Feb 25 16:45:42 crc kubenswrapper[4937]: E0225 16:45:42.024958 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f27af1e-6a33-46f8-bca7-cae305e3e1e3" containerName="container-00" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.024964 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f27af1e-6a33-46f8-bca7-cae305e3e1e3" containerName="container-00" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.025192 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4890cd3-0e53-48ad-b528-8320f44c6a83" containerName="collect-profiles" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.025243 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f27af1e-6a33-46f8-bca7-cae305e3e1e3" containerName="container-00" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.025917 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.031071 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jvrhp"/"default-dockercfg-96pms" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.196426 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxn4h\" (UniqueName: \"kubernetes.io/projected/89096233-6bda-4b97-be42-6b3f6c421466-kube-api-access-cxn4h\") pod \"crc-debug-7ztp5\" (UID: \"89096233-6bda-4b97-be42-6b3f6c421466\") " pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.196899 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89096233-6bda-4b97-be42-6b3f6c421466-host\") pod \"crc-debug-7ztp5\" (UID: \"89096233-6bda-4b97-be42-6b3f6c421466\") " pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.299806 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxn4h\" (UniqueName: \"kubernetes.io/projected/89096233-6bda-4b97-be42-6b3f6c421466-kube-api-access-cxn4h\") pod \"crc-debug-7ztp5\" (UID: \"89096233-6bda-4b97-be42-6b3f6c421466\") " pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.300140 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89096233-6bda-4b97-be42-6b3f6c421466-host\") pod \"crc-debug-7ztp5\" (UID: \"89096233-6bda-4b97-be42-6b3f6c421466\") " pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.300525 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89096233-6bda-4b97-be42-6b3f6c421466-host\") pod \"crc-debug-7ztp5\" (UID: \"89096233-6bda-4b97-be42-6b3f6c421466\") " pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.332787 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxn4h\" (UniqueName: \"kubernetes.io/projected/89096233-6bda-4b97-be42-6b3f6c421466-kube-api-access-cxn4h\") pod \"crc-debug-7ztp5\" (UID: \"89096233-6bda-4b97-be42-6b3f6c421466\") " pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.344532 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" Feb 25 16:45:42 crc kubenswrapper[4937]: W0225 16:45:42.373457 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89096233_6bda_4b97_be42_6b3f6c421466.slice/crio-25be3d2eed6e8c9b6f3a72f78990328e27ed8a839489ae70c94301259dc5b20e WatchSource:0}: Error finding container 25be3d2eed6e8c9b6f3a72f78990328e27ed8a839489ae70c94301259dc5b20e: Status 404 returned error can't find the container with id 25be3d2eed6e8c9b6f3a72f78990328e27ed8a839489ae70c94301259dc5b20e Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.586283 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="c523637845c15bc142c970843436a3634d3a2d1727208c0649d730f119f41f73" exitCode=0 Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.586349 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"c523637845c15bc142c970843436a3634d3a2d1727208c0649d730f119f41f73"} Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.586789 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd"} Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.586828 4937 scope.go:117] "RemoveContainer" containerID="6164c175e2361d3582417e3f3b2965bb7a53f63a499eaff0063f2a88224ac82d" Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.590328 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" event={"ID":"89096233-6bda-4b97-be42-6b3f6c421466","Type":"ContainerStarted","Data":"8fd7241ab4f28579b7fdd9ee032ef71a10c85d491c25d2d9560ad7f682167bd1"} Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.590404 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" event={"ID":"89096233-6bda-4b97-be42-6b3f6c421466","Type":"ContainerStarted","Data":"25be3d2eed6e8c9b6f3a72f78990328e27ed8a839489ae70c94301259dc5b20e"} Feb 25 16:45:42 crc kubenswrapper[4937]: I0225 16:45:42.630358 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" podStartSLOduration=0.630336595 podStartE2EDuration="630.336595ms" podCreationTimestamp="2026-02-25 16:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:45:42.617441442 +0000 UTC m=+3593.630833382" watchObservedRunningTime="2026-02-25 16:45:42.630336595 +0000 UTC m=+3593.643728475" Feb 25 16:45:43 crc kubenswrapper[4937]: I0225 16:45:43.403805 4937 scope.go:117] "RemoveContainer" containerID="98afa0be2a7238676648049bfd8bad309c32dec99502b3a71d7d6477a73bc0ea" Feb 25 16:45:43 crc kubenswrapper[4937]: I0225 16:45:43.604575 4937 generic.go:334] "Generic (PLEG): container finished" podID="89096233-6bda-4b97-be42-6b3f6c421466" containerID="8fd7241ab4f28579b7fdd9ee032ef71a10c85d491c25d2d9560ad7f682167bd1" exitCode=0 Feb 25 16:45:43 crc kubenswrapper[4937]: I0225 16:45:43.604702 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" 
event={"ID":"89096233-6bda-4b97-be42-6b3f6c421466","Type":"ContainerDied","Data":"8fd7241ab4f28579b7fdd9ee032ef71a10c85d491c25d2d9560ad7f682167bd1"} Feb 25 16:45:44 crc kubenswrapper[4937]: I0225 16:45:44.732380 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" Feb 25 16:45:44 crc kubenswrapper[4937]: I0225 16:45:44.780261 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jvrhp/crc-debug-7ztp5"] Feb 25 16:45:44 crc kubenswrapper[4937]: I0225 16:45:44.791317 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jvrhp/crc-debug-7ztp5"] Feb 25 16:45:44 crc kubenswrapper[4937]: I0225 16:45:44.873937 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89096233-6bda-4b97-be42-6b3f6c421466-host\") pod \"89096233-6bda-4b97-be42-6b3f6c421466\" (UID: \"89096233-6bda-4b97-be42-6b3f6c421466\") " Feb 25 16:45:44 crc kubenswrapper[4937]: I0225 16:45:44.874281 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89096233-6bda-4b97-be42-6b3f6c421466-host" (OuterVolumeSpecName: "host") pod "89096233-6bda-4b97-be42-6b3f6c421466" (UID: "89096233-6bda-4b97-be42-6b3f6c421466"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:45:44 crc kubenswrapper[4937]: I0225 16:45:44.874474 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxn4h\" (UniqueName: \"kubernetes.io/projected/89096233-6bda-4b97-be42-6b3f6c421466-kube-api-access-cxn4h\") pod \"89096233-6bda-4b97-be42-6b3f6c421466\" (UID: \"89096233-6bda-4b97-be42-6b3f6c421466\") " Feb 25 16:45:44 crc kubenswrapper[4937]: I0225 16:45:44.876068 4937 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89096233-6bda-4b97-be42-6b3f6c421466-host\") on node \"crc\" DevicePath \"\"" Feb 25 16:45:44 crc kubenswrapper[4937]: I0225 16:45:44.890543 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89096233-6bda-4b97-be42-6b3f6c421466-kube-api-access-cxn4h" (OuterVolumeSpecName: "kube-api-access-cxn4h") pod "89096233-6bda-4b97-be42-6b3f6c421466" (UID: "89096233-6bda-4b97-be42-6b3f6c421466"). InnerVolumeSpecName "kube-api-access-cxn4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:45:44 crc kubenswrapper[4937]: I0225 16:45:44.978183 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxn4h\" (UniqueName: \"kubernetes.io/projected/89096233-6bda-4b97-be42-6b3f6c421466-kube-api-access-cxn4h\") on node \"crc\" DevicePath \"\"" Feb 25 16:45:45 crc kubenswrapper[4937]: I0225 16:45:45.379982 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89096233-6bda-4b97-be42-6b3f6c421466" path="/var/lib/kubelet/pods/89096233-6bda-4b97-be42-6b3f6c421466/volumes" Feb 25 16:45:45 crc kubenswrapper[4937]: I0225 16:45:45.623928 4937 scope.go:117] "RemoveContainer" containerID="8fd7241ab4f28579b7fdd9ee032ef71a10c85d491c25d2d9560ad7f682167bd1" Feb 25 16:45:45 crc kubenswrapper[4937]: I0225 16:45:45.624034 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jvrhp/crc-debug-7ztp5" Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.004792 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jvrhp/crc-debug-9frs2"] Feb 25 16:45:46 crc kubenswrapper[4937]: E0225 16:45:46.005791 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89096233-6bda-4b97-be42-6b3f6c421466" containerName="container-00" Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.005819 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="89096233-6bda-4b97-be42-6b3f6c421466" containerName="container-00" Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.006228 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="89096233-6bda-4b97-be42-6b3f6c421466" containerName="container-00" Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.007545 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jvrhp/crc-debug-9frs2" Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.010036 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jvrhp"/"default-dockercfg-96pms" Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.200773 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32bfbee4-3e2a-41cd-a3ae-28618c58637e-host\") pod \"crc-debug-9frs2\" (UID: \"32bfbee4-3e2a-41cd-a3ae-28618c58637e\") " pod="openshift-must-gather-jvrhp/crc-debug-9frs2" Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.200949 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24h9t\" (UniqueName: \"kubernetes.io/projected/32bfbee4-3e2a-41cd-a3ae-28618c58637e-kube-api-access-24h9t\") pod \"crc-debug-9frs2\" (UID: \"32bfbee4-3e2a-41cd-a3ae-28618c58637e\") " pod="openshift-must-gather-jvrhp/crc-debug-9frs2" Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.302751 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32bfbee4-3e2a-41cd-a3ae-28618c58637e-host\") pod \"crc-debug-9frs2\" (UID: \"32bfbee4-3e2a-41cd-a3ae-28618c58637e\") " pod="openshift-must-gather-jvrhp/crc-debug-9frs2" Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.302855 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24h9t\" (UniqueName: \"kubernetes.io/projected/32bfbee4-3e2a-41cd-a3ae-28618c58637e-kube-api-access-24h9t\") pod \"crc-debug-9frs2\" (UID: \"32bfbee4-3e2a-41cd-a3ae-28618c58637e\") " pod="openshift-must-gather-jvrhp/crc-debug-9frs2" Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.302949 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32bfbee4-3e2a-41cd-a3ae-28618c58637e-host\") pod \"crc-debug-9frs2\" (UID: \"32bfbee4-3e2a-41cd-a3ae-28618c58637e\") " pod="openshift-must-gather-jvrhp/crc-debug-9frs2" Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.323174 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24h9t\" (UniqueName: \"kubernetes.io/projected/32bfbee4-3e2a-41cd-a3ae-28618c58637e-kube-api-access-24h9t\") pod \"crc-debug-9frs2\" (UID: \"32bfbee4-3e2a-41cd-a3ae-28618c58637e\") " pod="openshift-must-gather-jvrhp/crc-debug-9frs2" Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 
16:45:46.340509 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jvrhp/crc-debug-9frs2" Feb 25 16:45:46 crc kubenswrapper[4937]: W0225 16:45:46.378762 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32bfbee4_3e2a_41cd_a3ae_28618c58637e.slice/crio-f7aa3bab23d3f9e6d279e5969dbc49fdbf6552ca007f06b9a27274d1fd327c99 WatchSource:0}: Error finding container f7aa3bab23d3f9e6d279e5969dbc49fdbf6552ca007f06b9a27274d1fd327c99: Status 404 returned error can't find the container with id f7aa3bab23d3f9e6d279e5969dbc49fdbf6552ca007f06b9a27274d1fd327c99 Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.637120 4937 generic.go:334] "Generic (PLEG): container finished" podID="32bfbee4-3e2a-41cd-a3ae-28618c58637e" containerID="ccfda5072df262bfc2208b077dff586ae2a4802cef82a381f097ca566dba2b26" exitCode=0 Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.637255 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jvrhp/crc-debug-9frs2" event={"ID":"32bfbee4-3e2a-41cd-a3ae-28618c58637e","Type":"ContainerDied","Data":"ccfda5072df262bfc2208b077dff586ae2a4802cef82a381f097ca566dba2b26"} Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.637527 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jvrhp/crc-debug-9frs2" event={"ID":"32bfbee4-3e2a-41cd-a3ae-28618c58637e","Type":"ContainerStarted","Data":"f7aa3bab23d3f9e6d279e5969dbc49fdbf6552ca007f06b9a27274d1fd327c99"} Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.771429 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jvrhp/crc-debug-9frs2"] Feb 25 16:45:46 crc kubenswrapper[4937]: I0225 16:45:46.804193 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jvrhp/crc-debug-9frs2"] Feb 25 16:45:47 crc kubenswrapper[4937]: I0225 16:45:47.759679 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jvrhp/crc-debug-9frs2" Feb 25 16:45:47 crc kubenswrapper[4937]: I0225 16:45:47.937631 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24h9t\" (UniqueName: \"kubernetes.io/projected/32bfbee4-3e2a-41cd-a3ae-28618c58637e-kube-api-access-24h9t\") pod \"32bfbee4-3e2a-41cd-a3ae-28618c58637e\" (UID: \"32bfbee4-3e2a-41cd-a3ae-28618c58637e\") " Feb 25 16:45:47 crc kubenswrapper[4937]: I0225 16:45:47.937969 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32bfbee4-3e2a-41cd-a3ae-28618c58637e-host\") pod \"32bfbee4-3e2a-41cd-a3ae-28618c58637e\" (UID: \"32bfbee4-3e2a-41cd-a3ae-28618c58637e\") " Feb 25 16:45:47 crc kubenswrapper[4937]: I0225 16:45:47.938046 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32bfbee4-3e2a-41cd-a3ae-28618c58637e-host" (OuterVolumeSpecName: "host") pod "32bfbee4-3e2a-41cd-a3ae-28618c58637e" (UID: "32bfbee4-3e2a-41cd-a3ae-28618c58637e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:45:47 crc kubenswrapper[4937]: I0225 16:45:47.938429 4937 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32bfbee4-3e2a-41cd-a3ae-28618c58637e-host\") on node \"crc\" DevicePath \"\"" Feb 25 16:45:47 crc kubenswrapper[4937]: I0225 16:45:47.945154 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32bfbee4-3e2a-41cd-a3ae-28618c58637e-kube-api-access-24h9t" (OuterVolumeSpecName: "kube-api-access-24h9t") pod "32bfbee4-3e2a-41cd-a3ae-28618c58637e" (UID: "32bfbee4-3e2a-41cd-a3ae-28618c58637e"). InnerVolumeSpecName "kube-api-access-24h9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:45:48 crc kubenswrapper[4937]: I0225 16:45:48.040670 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24h9t\" (UniqueName: \"kubernetes.io/projected/32bfbee4-3e2a-41cd-a3ae-28618c58637e-kube-api-access-24h9t\") on node \"crc\" DevicePath \"\"" Feb 25 16:45:48 crc kubenswrapper[4937]: I0225 16:45:48.664277 4937 scope.go:117] "RemoveContainer" containerID="ccfda5072df262bfc2208b077dff586ae2a4802cef82a381f097ca566dba2b26" Feb 25 16:45:48 crc kubenswrapper[4937]: I0225 16:45:48.664403 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jvrhp/crc-debug-9frs2" Feb 25 16:45:49 crc kubenswrapper[4937]: I0225 16:45:49.377271 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32bfbee4-3e2a-41cd-a3ae-28618c58637e" path="/var/lib/kubelet/pods/32bfbee4-3e2a-41cd-a3ae-28618c58637e/volumes" Feb 25 16:46:00 crc kubenswrapper[4937]: I0225 16:46:00.149660 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533966-blbs6"] Feb 25 16:46:00 crc kubenswrapper[4937]: E0225 16:46:00.150631 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32bfbee4-3e2a-41cd-a3ae-28618c58637e" containerName="container-00" Feb 25 16:46:00 crc kubenswrapper[4937]: I0225 16:46:00.150647 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="32bfbee4-3e2a-41cd-a3ae-28618c58637e" containerName="container-00" Feb 25 16:46:00 crc kubenswrapper[4937]: I0225 16:46:00.150939 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="32bfbee4-3e2a-41cd-a3ae-28618c58637e" containerName="container-00" Feb 25 16:46:00 crc kubenswrapper[4937]: I0225 16:46:00.151915 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533966-blbs6" Feb 25 16:46:00 crc kubenswrapper[4937]: I0225 16:46:00.154621 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:46:00 crc kubenswrapper[4937]: I0225 16:46:00.158879 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:46:00 crc kubenswrapper[4937]: I0225 16:46:00.158927 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:46:00 crc kubenswrapper[4937]: I0225 16:46:00.159918 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533966-blbs6"] Feb 25 16:46:00 crc kubenswrapper[4937]: I0225 16:46:00.251285 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwvgc\" (UniqueName: \"kubernetes.io/projected/9b35a656-aaf5-4dfb-97fb-4b80d530e729-kube-api-access-mwvgc\") pod \"auto-csr-approver-29533966-blbs6\" (UID: \"9b35a656-aaf5-4dfb-97fb-4b80d530e729\") " pod="openshift-infra/auto-csr-approver-29533966-blbs6" Feb 25 16:46:00 crc kubenswrapper[4937]: I0225 16:46:00.353595 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwvgc\" (UniqueName: \"kubernetes.io/projected/9b35a656-aaf5-4dfb-97fb-4b80d530e729-kube-api-access-mwvgc\") pod \"auto-csr-approver-29533966-blbs6\" (UID: \"9b35a656-aaf5-4dfb-97fb-4b80d530e729\") " pod="openshift-infra/auto-csr-approver-29533966-blbs6" Feb 25 16:46:00 crc kubenswrapper[4937]: I0225 16:46:00.381684 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwvgc\" (UniqueName: \"kubernetes.io/projected/9b35a656-aaf5-4dfb-97fb-4b80d530e729-kube-api-access-mwvgc\") pod \"auto-csr-approver-29533966-blbs6\" (UID: \"9b35a656-aaf5-4dfb-97fb-4b80d530e729\") " pod="openshift-infra/auto-csr-approver-29533966-blbs6" Feb 25 16:46:00 crc kubenswrapper[4937]: I0225 16:46:00.468166 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533966-blbs6" Feb 25 16:46:00 crc kubenswrapper[4937]: I0225 16:46:00.963413 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533966-blbs6"] Feb 25 16:46:01 crc kubenswrapper[4937]: I0225 16:46:01.789211 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533966-blbs6" event={"ID":"9b35a656-aaf5-4dfb-97fb-4b80d530e729","Type":"ContainerStarted","Data":"5b01d239b79d164082b59122dfd993d2d3836fb97a790439f1b8716b8f43d685"} Feb 25 16:46:02 crc kubenswrapper[4937]: I0225 16:46:02.804271 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533966-blbs6" event={"ID":"9b35a656-aaf5-4dfb-97fb-4b80d530e729","Type":"ContainerStarted","Data":"c664db3cb6141b0ebe690919484b0051471ebc0c271ecc246f9b2a5996858002"} Feb 25 16:46:02 crc kubenswrapper[4937]: I0225 16:46:02.853785 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533966-blbs6" podStartSLOduration=1.419090189 podStartE2EDuration="2.853765473s" podCreationTimestamp="2026-02-25 16:46:00 +0000 UTC" firstStartedPulling="2026-02-25 16:46:00.974275579 +0000 UTC m=+3611.987667469" lastFinishedPulling="2026-02-25 16:46:02.408950863 +0000 UTC m=+3613.422342753" observedRunningTime="2026-02-25 16:46:02.837967557 +0000 UTC m=+3613.851359447" watchObservedRunningTime="2026-02-25 16:46:02.853765473 +0000 UTC m=+3613.867157363" Feb 25 16:46:03 crc kubenswrapper[4937]: I0225 16:46:03.819611 4937 generic.go:334] "Generic (PLEG): container finished" podID="9b35a656-aaf5-4dfb-97fb-4b80d530e729" containerID="c664db3cb6141b0ebe690919484b0051471ebc0c271ecc246f9b2a5996858002" exitCode=0 Feb 25 16:46:03 crc kubenswrapper[4937]: I0225 16:46:03.819706 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533966-blbs6" event={"ID":"9b35a656-aaf5-4dfb-97fb-4b80d530e729","Type":"ContainerDied","Data":"c664db3cb6141b0ebe690919484b0051471ebc0c271ecc246f9b2a5996858002"} Feb 25 16:46:05 crc kubenswrapper[4937]: I0225 16:46:05.484646 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533966-blbs6" Feb 25 16:46:05 crc kubenswrapper[4937]: I0225 16:46:05.678090 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwvgc\" (UniqueName: \"kubernetes.io/projected/9b35a656-aaf5-4dfb-97fb-4b80d530e729-kube-api-access-mwvgc\") pod \"9b35a656-aaf5-4dfb-97fb-4b80d530e729\" (UID: \"9b35a656-aaf5-4dfb-97fb-4b80d530e729\") " Feb 25 16:46:05 crc kubenswrapper[4937]: I0225 16:46:05.688680 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b35a656-aaf5-4dfb-97fb-4b80d530e729-kube-api-access-mwvgc" (OuterVolumeSpecName: "kube-api-access-mwvgc") pod "9b35a656-aaf5-4dfb-97fb-4b80d530e729" (UID: "9b35a656-aaf5-4dfb-97fb-4b80d530e729"). InnerVolumeSpecName "kube-api-access-mwvgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:46:05 crc kubenswrapper[4937]: I0225 16:46:05.781910 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwvgc\" (UniqueName: \"kubernetes.io/projected/9b35a656-aaf5-4dfb-97fb-4b80d530e729-kube-api-access-mwvgc\") on node \"crc\" DevicePath \"\"" Feb 25 16:46:05 crc kubenswrapper[4937]: I0225 16:46:05.839913 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533966-blbs6" event={"ID":"9b35a656-aaf5-4dfb-97fb-4b80d530e729","Type":"ContainerDied","Data":"5b01d239b79d164082b59122dfd993d2d3836fb97a790439f1b8716b8f43d685"} Feb 25 16:46:05 crc kubenswrapper[4937]: I0225 16:46:05.839953 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b01d239b79d164082b59122dfd993d2d3836fb97a790439f1b8716b8f43d685" Feb 25 16:46:05 crc kubenswrapper[4937]: I0225 16:46:05.839985 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533966-blbs6" Feb 25 16:46:05 crc kubenswrapper[4937]: I0225 16:46:05.896820 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533960-g7p4h"] Feb 25 16:46:05 crc kubenswrapper[4937]: I0225 16:46:05.904573 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533960-g7p4h"] Feb 25 16:46:07 crc kubenswrapper[4937]: I0225 16:46:07.384361 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b" path="/var/lib/kubelet/pods/b4466a0e-fffa-4dbd-8b31-b96ef1dc9d7b/volumes" Feb 25 16:46:17 crc kubenswrapper[4937]: I0225 16:46:17.558936 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d773f4d2-bec3-4379-a7a2-29975a18c85b/init-config-reloader/0.log" Feb 25 16:46:17 crc kubenswrapper[4937]: I0225 16:46:17.715070 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d773f4d2-bec3-4379-a7a2-29975a18c85b/init-config-reloader/0.log" Feb 25 16:46:17 crc kubenswrapper[4937]: I0225 16:46:17.796304 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d773f4d2-bec3-4379-a7a2-29975a18c85b/alertmanager/0.log" Feb 25 16:46:17 crc kubenswrapper[4937]: I0225 16:46:17.812783 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d773f4d2-bec3-4379-a7a2-29975a18c85b/config-reloader/0.log" Feb 25 16:46:17 crc kubenswrapper[4937]: I0225 16:46:17.935946 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c757b5c5d-sqs2g_11e5858d-ee0a-4f76-8863-25be5ef4df36/barbican-api/0.log" Feb 25 16:46:18 crc kubenswrapper[4937]: I0225 16:46:18.010461 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c757b5c5d-sqs2g_11e5858d-ee0a-4f76-8863-25be5ef4df36/barbican-api-log/0.log" Feb 25 16:46:18 crc kubenswrapper[4937]: I0225 16:46:18.095089 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6b9499bbcd-kr7kb_6520b70f-9bf6-4b3c-ad1e-4f43da8daec5/barbican-keystone-listener/0.log" Feb 25 16:46:18 crc kubenswrapper[4937]: I0225 16:46:18.207912 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6b9499bbcd-kr7kb_6520b70f-9bf6-4b3c-ad1e-4f43da8daec5/barbican-keystone-listener-log/0.log" Feb 
25 16:46:18 crc kubenswrapper[4937]: I0225 16:46:18.228678 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b99d98bc-2r54q_394cbe6e-1697-449d-abaf-68e9ba275096/barbican-worker/0.log" Feb 25 16:46:18 crc kubenswrapper[4937]: I0225 16:46:18.368954 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b99d98bc-2r54q_394cbe6e-1697-449d-abaf-68e9ba275096/barbican-worker-log/0.log" Feb 25 16:46:18 crc kubenswrapper[4937]: I0225 16:46:18.579613 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc_165310e1-208b-4a29-a8fd-be630d60fc08/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:18 crc kubenswrapper[4937]: I0225 16:46:18.678040 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4d9d51be-46d2-4d06-8f81-f34e8693e52d/ceilometer-notification-agent/0.log" Feb 25 16:46:18 crc kubenswrapper[4937]: I0225 16:46:18.723868 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4d9d51be-46d2-4d06-8f81-f34e8693e52d/ceilometer-central-agent/0.log" Feb 25 16:46:18 crc kubenswrapper[4937]: I0225 16:46:18.828987 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4d9d51be-46d2-4d06-8f81-f34e8693e52d/proxy-httpd/0.log" Feb 25 16:46:18 crc kubenswrapper[4937]: I0225 16:46:18.872147 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4d9d51be-46d2-4d06-8f81-f34e8693e52d/sg-core/0.log" Feb 25 16:46:18 crc kubenswrapper[4937]: I0225 16:46:18.964828 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5716a1da-2a42-48cd-96cd-149adb030006/cinder-api/0.log" Feb 25 16:46:19 crc kubenswrapper[4937]: I0225 16:46:19.056868 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5716a1da-2a42-48cd-96cd-149adb030006/cinder-api-log/0.log" Feb 25 16:46:19 crc kubenswrapper[4937]: I0225 16:46:19.155137 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f6b49871-d3a3-4846-9c5d-3df7b920a420/cinder-scheduler/0.log" Feb 25 16:46:19 crc kubenswrapper[4937]: I0225 16:46:19.194392 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f6b49871-d3a3-4846-9c5d-3df7b920a420/probe/0.log" Feb 25 16:46:19 crc kubenswrapper[4937]: I0225 16:46:19.405528 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_5787788d-bec1-4541-a34d-26ab6b7f4aa5/cloudkitty-api/0.log" Feb 25 16:46:19 crc kubenswrapper[4937]: I0225 16:46:19.415737 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_5787788d-bec1-4541-a34d-26ab6b7f4aa5/cloudkitty-api-log/0.log" Feb 25 16:46:19 crc kubenswrapper[4937]: I0225 16:46:19.635129 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_123b3439-f7ab-44b4-bbed-02539668cf80/loki-compactor/0.log" Feb 25 16:46:19 crc kubenswrapper[4937]: I0225 16:46:19.963228 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-585d9bcbc-mplc7_48e5f2c6-d4ed-48a1-8737-693b54c43613/loki-distributor/0.log" Feb 25 16:46:20 crc kubenswrapper[4937]: I0225 16:46:20.068776 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-t9wss_e13a5d7c-5a1a-466b-83a0-d76859e2cd3e/gateway/0.log" Feb 25 16:46:20 crc kubenswrapper[4937]: I0225 16:46:20.166775 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-v7lt7_836ae71a-cf0f-4a00-a0bc-78d1be68f830/gateway/0.log" Feb 25 16:46:20 crc kubenswrapper[4937]: I0225 16:46:20.417383 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_08382e6d-e8e5-4656-a524-26c8269114fd/loki-ingester/0.log" Feb 25 16:46:20 crc kubenswrapper[4937]: I0225 16:46:20.483011 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_c4c8966b-44e5-42fd-ae20-d3099876ee36/loki-index-gateway/0.log" Feb 25 16:46:20 crc kubenswrapper[4937]: I0225 16:46:20.747670 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p_5d351b94-5168-4f7f-9d70-c2cd2225dba8/loki-query-frontend/0.log" Feb 25 16:46:20 crc kubenswrapper[4937]: I0225 16:46:20.893428 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-58c84b5844-86646_f72068e0-28e8-4c10-abeb-c067fe29c2f4/loki-querier/0.log" Feb 25 16:46:21 crc kubenswrapper[4937]: I0225 16:46:21.187597 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-h5j54_d0caaa2f-df02-4bb7-a490-f3333d6c47a2/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:21 crc kubenswrapper[4937]: I0225 16:46:21.384089 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp_1bd696e6-be36-4b9e-9f00-9ba293305842/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:21 crc kubenswrapper[4937]: I0225 16:46:21.578288 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-hjwwn_4ec5a514-9a47-413c-8e18-113b8295e0b7/init/0.log" Feb 25 16:46:21 crc kubenswrapper[4937]: I0225 16:46:21.689319 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-hjwwn_4ec5a514-9a47-413c-8e18-113b8295e0b7/init/0.log" Feb 25 16:46:21 crc kubenswrapper[4937]: I0225 16:46:21.935522 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-29n8w_e34d42d5-94de-45fe-b002-65da3cd1d49d/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:21 crc kubenswrapper[4937]: I0225 16:46:21.995175 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-hjwwn_4ec5a514-9a47-413c-8e18-113b8295e0b7/dnsmasq-dns/0.log" Feb 25 16:46:22 crc kubenswrapper[4937]: I0225 16:46:22.108540 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c50d4693-04e4-40a4-a07d-9475ce9b0125/glance-httpd/0.log" Feb 25 16:46:22 crc kubenswrapper[4937]: I0225 16:46:22.210769 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c50d4693-04e4-40a4-a07d-9475ce9b0125/glance-log/0.log" Feb 25 16:46:22 crc kubenswrapper[4937]: I0225 16:46:22.302460 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_55cf40d2-3819-46a7-b9c1-aad7f3a65542/glance-httpd/0.log" Feb 25 16:46:22 crc kubenswrapper[4937]: 
I0225 16:46:22.435066 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_55cf40d2-3819-46a7-b9c1-aad7f3a65542/glance-log/0.log" Feb 25 16:46:22 crc kubenswrapper[4937]: I0225 16:46:22.464129 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-r74cs_d08f7150-84a3-42bf-bed8-624a7f5e2c35/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:22 crc kubenswrapper[4937]: I0225 16:46:22.847243 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-2mgkn_4784f56a-332c-45b1-b121-ec925aece823/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:22 crc kubenswrapper[4937]: I0225 16:46:22.974453 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_b0312225-730b-46a3-8142-6a39e9d69f60/kube-state-metrics/0.log" Feb 25 16:46:23 crc kubenswrapper[4937]: I0225 16:46:23.197543 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-bzxct_b1499078-381f-48bd-bcfb-c9bd057fa5d2/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:23 crc kubenswrapper[4937]: I0225 16:46:23.221642 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7595479948-g6dtl_e3d2f89f-c1af-45d5-bfdd-6f9c3141c124/keystone-api/0.log" Feb 25 16:46:23 crc kubenswrapper[4937]: I0225 16:46:23.834166 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57ff6d8577-ntrmb_351a0bd5-2cd4-4f52-af68-6d86a512add0/neutron-httpd/0.log" Feb 25 16:46:23 crc kubenswrapper[4937]: I0225 16:46:23.867683 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57ff6d8577-ntrmb_351a0bd5-2cd4-4f52-af68-6d86a512add0/neutron-api/0.log" Feb 25 16:46:24 crc kubenswrapper[4937]: I0225 16:46:24.086385 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss_9851d2ed-9455-4797-bcad-ed3b82909df5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:24 crc kubenswrapper[4937]: I0225 16:46:24.698061 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e90a8e88-5fc8-48fe-af70-c6f6553d8b62/nova-api-log/0.log" Feb 25 16:46:24 crc kubenswrapper[4937]: I0225 16:46:24.800703 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c69df688-29bc-47e8-98ef-56b506f9e7c1/nova-cell0-conductor-conductor/0.log" Feb 25 16:46:24 crc kubenswrapper[4937]: I0225 16:46:24.848995 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e90a8e88-5fc8-48fe-af70-c6f6553d8b62/nova-api-api/0.log" Feb 25 16:46:25 crc kubenswrapper[4937]: I0225 16:46:25.139497 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f95d4ab5-6b4e-477f-848e-0b98b93c8ba1/nova-cell1-conductor-conductor/0.log" Feb 25 16:46:25 crc kubenswrapper[4937]: I0225 16:46:25.277309 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d447480a-3bd1-4934-9ba7-73122b37df7c/nova-cell1-novncproxy-novncproxy/0.log" Feb 25 16:46:25 crc kubenswrapper[4937]: I0225 16:46:25.522937 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-9w4zb_dbc3ffd6-39f1-4130-9083-033d890d558d/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:25 crc kubenswrapper[4937]: I0225 16:46:25.699407 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9d7981f5-bfaf-41a0-a577-ab25b40dc375/nova-metadata-log/0.log" Feb 25 16:46:26 crc kubenswrapper[4937]: I0225 16:46:26.175058 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_987df1b8-49a2-4ec2-b92d-64619d55b516/nova-scheduler-scheduler/0.log" Feb 25 16:46:26 crc kubenswrapper[4937]: I0225 16:46:26.477929 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9e2484d7-6d50-43d2-9105-e83280f565ac/mysql-bootstrap/0.log" Feb 25 16:46:26 crc kubenswrapper[4937]: I0225 16:46:26.644147 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9e2484d7-6d50-43d2-9105-e83280f565ac/mysql-bootstrap/0.log" Feb 25 16:46:26 crc kubenswrapper[4937]: I0225 16:46:26.691245 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9e2484d7-6d50-43d2-9105-e83280f565ac/galera/0.log" Feb 25 16:46:26 crc kubenswrapper[4937]: I0225 16:46:26.917642 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9d7981f5-bfaf-41a0-a577-ab25b40dc375/nova-metadata-metadata/0.log" Feb 25 16:46:27 crc kubenswrapper[4937]: I0225 16:46:27.256291 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe/mysql-bootstrap/0.log" Feb 25 16:46:27 crc kubenswrapper[4937]: I0225 16:46:27.409640 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe/mysql-bootstrap/0.log" Feb 25 16:46:27 crc kubenswrapper[4937]: I0225 16:46:27.451616 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe/galera/0.log" Feb 25 16:46:27 crc kubenswrapper[4937]: I0225 16:46:27.719142 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a03635e4-24a3-460b-ab0e-e3f677ac95c5/openstackclient/0.log" Feb 25 16:46:27 crc kubenswrapper[4937]: I0225 16:46:27.981004 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-lkm6g_d9c73995-8885-40f2-8491-6216d1ec5c7b/openstack-network-exporter/0.log" Feb 25 16:46:28 crc kubenswrapper[4937]: I0225 16:46:28.064436 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4rsxl_388f0d04-d580-46ae-a729-667d81ad11a0/ovsdb-server-init/0.log" Feb 25 16:46:28 crc kubenswrapper[4937]: I0225 16:46:28.205250 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4rsxl_388f0d04-d580-46ae-a729-667d81ad11a0/ovs-vswitchd/0.log" Feb 25 16:46:28 crc kubenswrapper[4937]: I0225 16:46:28.289787 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4rsxl_388f0d04-d580-46ae-a729-667d81ad11a0/ovsdb-server-init/0.log" Feb 25 16:46:28 crc kubenswrapper[4937]: I0225 16:46:28.310921 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4rsxl_388f0d04-d580-46ae-a729-667d81ad11a0/ovsdb-server/0.log" Feb 25 16:46:28 crc kubenswrapper[4937]: I0225 16:46:28.594923 4937 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-controller-sgdhb_c0b0baed-3140-4ac4-9d27-e8fc15c390c2/ovn-controller/0.log" Feb 25 16:46:28 crc kubenswrapper[4937]: I0225 16:46:28.710908 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-4mrhf_af8ba197-d732-4514-9b22-4d2aa6f5a7f6/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:28 crc kubenswrapper[4937]: I0225 16:46:28.941782 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8ce79682-d3ee-4afb-ba50-fdacc0fe6910/ovn-northd/0.log" Feb 25 16:46:28 crc kubenswrapper[4937]: I0225 16:46:28.976582 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8ce79682-d3ee-4afb-ba50-fdacc0fe6910/openstack-network-exporter/0.log" Feb 25 16:46:29 crc kubenswrapper[4937]: I0225 16:46:29.160226 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0a044be7-a58d-4684-8252-5a850694fb04/openstack-network-exporter/0.log" Feb 25 16:46:29 crc kubenswrapper[4937]: I0225 16:46:29.209843 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0a044be7-a58d-4684-8252-5a850694fb04/ovsdbserver-nb/0.log" Feb 25 16:46:29 crc kubenswrapper[4937]: I0225 16:46:29.381271 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08/openstack-network-exporter/0.log" Feb 25 16:46:29 crc kubenswrapper[4937]: I0225 16:46:29.447992 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08/ovsdbserver-sb/0.log" Feb 25 16:46:29 crc kubenswrapper[4937]: I0225 16:46:29.767466 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-575b75bdd-mz6p6_b9abdfa0-773f-4d50-ae2d-8d7a429b5df7/placement-log/0.log" Feb 25 16:46:29 crc kubenswrapper[4937]: I0225 16:46:29.830740 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-575b75bdd-mz6p6_b9abdfa0-773f-4d50-ae2d-8d7a429b5df7/placement-api/0.log" Feb 25 16:46:30 crc kubenswrapper[4937]: I0225 16:46:30.322562 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a/init-config-reloader/0.log" Feb 25 16:46:30 crc kubenswrapper[4937]: I0225 16:46:30.396686 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a/config-reloader/0.log" Feb 25 16:46:30 crc kubenswrapper[4937]: I0225 16:46:30.482106 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a/init-config-reloader/0.log" Feb 25 16:46:30 crc kubenswrapper[4937]: I0225 16:46:30.574299 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a/prometheus/0.log" Feb 25 16:46:30 crc kubenswrapper[4937]: I0225 16:46:30.657555 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a/thanos-sidecar/0.log" Feb 25 16:46:30 crc kubenswrapper[4937]: I0225 16:46:30.866850 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ab7e006f-0788-42e5-aee9-543e29514c09/setup-container/0.log" Feb 25 16:46:31 crc kubenswrapper[4937]: I0225 16:46:31.124726 4937 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ab7e006f-0788-42e5-aee9-543e29514c09/setup-container/0.log" Feb 25 16:46:31 crc kubenswrapper[4937]: I0225 16:46:31.221140 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ab7e006f-0788-42e5-aee9-543e29514c09/rabbitmq/0.log" Feb 25 16:46:31 crc kubenswrapper[4937]: I0225 16:46:31.350038 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4779f4bd-7580-49e7-b536-ce3b8c77a8d4/setup-container/0.log" Feb 25 16:46:31 crc kubenswrapper[4937]: I0225 16:46:31.593688 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4779f4bd-7580-49e7-b536-ce3b8c77a8d4/rabbitmq/0.log" Feb 25 16:46:31 crc kubenswrapper[4937]: I0225 16:46:31.624084 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4779f4bd-7580-49e7-b536-ce3b8c77a8d4/setup-container/0.log" Feb 25 16:46:31 crc kubenswrapper[4937]: I0225 16:46:31.875979 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv_7edeb14d-a4c4-402a-a45f-b30a6f23ffe9/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:32 crc kubenswrapper[4937]: I0225 16:46:32.002726 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-78hr5_9cb9799d-3115-4657-a7f3-18fbcb14a073/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:32 crc kubenswrapper[4937]: I0225 16:46:32.212542 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j_6ee081e9-3c3e-4bd7-9c7d-a4a917946879/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:32 crc kubenswrapper[4937]: I0225 16:46:32.499782 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-hxbrd_f9e83917-e8a9-4ec3-9714-591147de094e/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:32 crc kubenswrapper[4937]: I0225 16:46:32.603840 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-v5wvt_2793b4d3-40ec-416d-8a93-0bb9b23ab909/ssh-known-hosts-edpm-deployment/0.log" Feb 25 16:46:32 crc kubenswrapper[4937]: I0225 16:46:32.832297 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-75675bb4d7-q28jd_c0193db4-c078-4d8c-8437-538da8d426d2/proxy-server/0.log" Feb 25 16:46:32 crc kubenswrapper[4937]: I0225 16:46:32.837077 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-75675bb4d7-q28jd_c0193db4-c078-4d8c-8437-538da8d426d2/proxy-httpd/0.log" Feb 25 16:46:33 crc kubenswrapper[4937]: I0225 16:46:33.287162 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-6nkbm_0a0f0530-95e1-4231-9933-bedb49b72a88/swift-ring-rebalance/0.log" Feb 25 16:46:33 crc kubenswrapper[4937]: I0225 16:46:33.514588 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/account-auditor/0.log" Feb 25 16:46:33 crc kubenswrapper[4937]: I0225 16:46:33.533047 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/account-reaper/0.log" Feb 25 16:46:33 crc kubenswrapper[4937]: I0225 16:46:33.555234 4937 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_d41b7062-11e8-401a-a063-8467cf1da4f2/cloudkitty-proc/0.log" Feb 25 16:46:33 crc kubenswrapper[4937]: I0225 16:46:33.676944 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/account-replicator/0.log" Feb 25 16:46:33 crc kubenswrapper[4937]: I0225 16:46:33.738401 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/container-auditor/0.log" Feb 25 16:46:33 crc kubenswrapper[4937]: I0225 16:46:33.806007 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/account-server/0.log" Feb 25 16:46:33 crc kubenswrapper[4937]: I0225 16:46:33.839618 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/container-replicator/0.log" Feb 25 16:46:33 crc kubenswrapper[4937]: I0225 16:46:33.969847 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/container-server/0.log" Feb 25 16:46:34 crc kubenswrapper[4937]: I0225 16:46:34.067775 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/container-updater/0.log" Feb 25 16:46:34 crc kubenswrapper[4937]: I0225 16:46:34.091202 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/object-auditor/0.log" Feb 25 16:46:34 crc kubenswrapper[4937]: I0225 16:46:34.104160 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/object-expirer/0.log" Feb 25 16:46:34 crc kubenswrapper[4937]: I0225 16:46:34.248299 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/object-replicator/0.log" Feb 25 16:46:34 crc kubenswrapper[4937]: I0225 16:46:34.377017 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/object-updater/0.log" Feb 25 16:46:34 crc kubenswrapper[4937]: I0225 16:46:34.440151 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/object-server/0.log" Feb 25 16:46:34 crc kubenswrapper[4937]: I0225 16:46:34.474781 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/rsync/0.log" Feb 25 16:46:34 crc kubenswrapper[4937]: I0225 16:46:34.678799 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/swift-recon-cron/0.log" Feb 25 16:46:34 crc kubenswrapper[4937]: I0225 16:46:34.807249 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-g87qr_a6ef0688-25f8-4018-8976-30334bf11136/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:34 crc kubenswrapper[4937]: I0225 16:46:34.989637 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_f62befdf-83b6-4767-8de5-d552bb54e3f9/test-operator-logs-container/0.log" Feb 25 16:46:35 crc kubenswrapper[4937]: I0225 16:46:35.028122 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_994bcbfb-8270-42b1-bc77-6a262f2d29e3/tempest-tests-tempest-tests-runner/0.log" Feb 25 16:46:35 crc kubenswrapper[4937]: I0225 16:46:35.168626 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t_cceb45e3-0685-45fb-b7c3-cf18ccb0649b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:46:37 crc kubenswrapper[4937]: I0225 16:46:37.335631 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_2ddd71fd-4c47-4357-87e7-16a2010a23df/memcached/0.log" Feb 25 16:46:43 crc kubenswrapper[4937]: I0225 16:46:43.516457 4937 scope.go:117] "RemoveContainer" containerID="2680c959e1b83254b0c4e7a20d96a327e2affa1f6a82c163ec8e25814331b94a" Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.277246 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cx2ct"] Feb 25 16:47:00 crc kubenswrapper[4937]: E0225 16:47:00.278416 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b35a656-aaf5-4dfb-97fb-4b80d530e729" containerName="oc" Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.278437 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b35a656-aaf5-4dfb-97fb-4b80d530e729" containerName="oc" Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.278747 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b35a656-aaf5-4dfb-97fb-4b80d530e729" containerName="oc" Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.280599 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.302905 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a53ecf-fa28-4628-bece-dfd41abb0ed6-catalog-content\") pod \"community-operators-cx2ct\" (UID: \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\") " pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.302958 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a53ecf-fa28-4628-bece-dfd41abb0ed6-utilities\") pod \"community-operators-cx2ct\" (UID: \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\") " pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.303252 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-647gd\" (UniqueName: \"kubernetes.io/projected/49a53ecf-fa28-4628-bece-dfd41abb0ed6-kube-api-access-647gd\") pod \"community-operators-cx2ct\" (UID: \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\") " pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.312229 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cx2ct"] Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.404756 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a53ecf-fa28-4628-bece-dfd41abb0ed6-catalog-content\") pod \"community-operators-cx2ct\" (UID: \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\") " pod="openshift-marketplace/community-operators-cx2ct" Feb 25 
16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.404802 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a53ecf-fa28-4628-bece-dfd41abb0ed6-utilities\") pod \"community-operators-cx2ct\" (UID: \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\") " pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.404875 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-647gd\" (UniqueName: \"kubernetes.io/projected/49a53ecf-fa28-4628-bece-dfd41abb0ed6-kube-api-access-647gd\") pod \"community-operators-cx2ct\" (UID: \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\") " pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.423989 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a53ecf-fa28-4628-bece-dfd41abb0ed6-utilities\") pod \"community-operators-cx2ct\" (UID: \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\") " pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.429962 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a53ecf-fa28-4628-bece-dfd41abb0ed6-catalog-content\") pod \"community-operators-cx2ct\" (UID: \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\") " pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.438465 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-647gd\" (UniqueName: \"kubernetes.io/projected/49a53ecf-fa28-4628-bece-dfd41abb0ed6-kube-api-access-647gd\") pod \"community-operators-cx2ct\" (UID: \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\") " pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:00 crc kubenswrapper[4937]: I0225 16:47:00.600394 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:01 crc kubenswrapper[4937]: I0225 16:47:01.174998 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cx2ct"] Feb 25 16:47:01 crc kubenswrapper[4937]: I0225 16:47:01.329394 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx2ct" event={"ID":"49a53ecf-fa28-4628-bece-dfd41abb0ed6","Type":"ContainerStarted","Data":"ef5220013fe004db1ecd79a2f38eea9482399b6736dd72d16785a1cd9559c78a"} Feb 25 16:47:02 crc kubenswrapper[4937]: I0225 16:47:02.337857 4937 generic.go:334] "Generic (PLEG): container finished" podID="49a53ecf-fa28-4628-bece-dfd41abb0ed6" containerID="a0778be5b3c9f9989c619a1554af89870e141431855ee6dd7b7a62726719ac07" exitCode=0 Feb 25 16:47:02 crc kubenswrapper[4937]: I0225 16:47:02.338109 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx2ct" event={"ID":"49a53ecf-fa28-4628-bece-dfd41abb0ed6","Type":"ContainerDied","Data":"a0778be5b3c9f9989c619a1554af89870e141431855ee6dd7b7a62726719ac07"} Feb 25 16:47:03 crc kubenswrapper[4937]: I0225 16:47:03.350941 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx2ct" event={"ID":"49a53ecf-fa28-4628-bece-dfd41abb0ed6","Type":"ContainerStarted","Data":"169e45adc56382ea8e00fa69eac42de232a6e8fb28130a21764027894213254e"} Feb 25 16:47:05 crc kubenswrapper[4937]: I0225 16:47:05.373524 4937 generic.go:334] "Generic (PLEG): container finished" podID="49a53ecf-fa28-4628-bece-dfd41abb0ed6" containerID="169e45adc56382ea8e00fa69eac42de232a6e8fb28130a21764027894213254e" exitCode=0 Feb 25 16:47:05 crc kubenswrapper[4937]: I0225 16:47:05.384441 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx2ct" event={"ID":"49a53ecf-fa28-4628-bece-dfd41abb0ed6","Type":"ContainerDied","Data":"169e45adc56382ea8e00fa69eac42de232a6e8fb28130a21764027894213254e"} Feb 25 16:47:06 crc kubenswrapper[4937]: I0225 16:47:06.227498 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/util/0.log" Feb 25 16:47:06 crc kubenswrapper[4937]: I0225 16:47:06.384375 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx2ct" event={"ID":"49a53ecf-fa28-4628-bece-dfd41abb0ed6","Type":"ContainerStarted","Data":"9fe8711145d4376323b4117466d64051b7b97e852b6202691ff707925615b3cb"} Feb 25 16:47:06 crc kubenswrapper[4937]: I0225 16:47:06.409382 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cx2ct" podStartSLOduration=2.659917771 podStartE2EDuration="6.409358812s" podCreationTimestamp="2026-02-25 16:47:00 +0000 UTC" firstStartedPulling="2026-02-25 16:47:02.34167204 +0000 UTC m=+3673.355063930" lastFinishedPulling="2026-02-25 16:47:06.091113081 +0000 UTC m=+3677.104504971" observedRunningTime="2026-02-25 16:47:06.401262139 +0000 UTC m=+3677.414654029" watchObservedRunningTime="2026-02-25 16:47:06.409358812 +0000 UTC m=+3677.422750702" Feb 25 16:47:06 crc kubenswrapper[4937]: I0225 16:47:06.430636 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/util/0.log" Feb 25 16:47:06 crc 
kubenswrapper[4937]: I0225 16:47:06.496144 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/pull/0.log" Feb 25 16:47:06 crc kubenswrapper[4937]: I0225 16:47:06.519256 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/pull/0.log" Feb 25 16:47:06 crc kubenswrapper[4937]: I0225 16:47:06.712864 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/util/0.log" Feb 25 16:47:06 crc kubenswrapper[4937]: I0225 16:47:06.760606 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/pull/0.log" Feb 25 16:47:06 crc kubenswrapper[4937]: I0225 16:47:06.778928 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/extract/0.log" Feb 25 16:47:07 crc kubenswrapper[4937]: I0225 16:47:07.601720 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-95dsz_5c7c6408-d0c4-42ea-ae7b-e10b49e13355/manager/0.log" Feb 25 16:47:08 crc kubenswrapper[4937]: I0225 16:47:08.036218 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-q92r7_079024f7-46b2-46fa-b96b-e4dca470cb4b/manager/0.log" Feb 25 16:47:08 crc kubenswrapper[4937]: I0225 16:47:08.346230 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-k7z4s_8132d735-0341-43be-93de-730c15511083/manager/0.log" Feb 25 16:47:08 crc kubenswrapper[4937]: I0225 16:47:08.738307 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-lkw74_806dde6d-ac75-47d7-98e2-0ba5959614a3/manager/0.log" Feb 25 16:47:08 crc kubenswrapper[4937]: I0225 16:47:08.861587 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-lnw2m_7b01abed-0e59-495b-8b5e-2229c8d3215f/manager/0.log" Feb 25 16:47:09 crc kubenswrapper[4937]: I0225 16:47:09.139778 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-xhxm2_7f4f0820-dd56-4d0b-aa5e-70dcab23e568/manager/0.log" Feb 25 16:47:09 crc kubenswrapper[4937]: I0225 16:47:09.379945 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-vjpzc_88ef567f-e68d-47aa-9788-4307003a77a0/manager/0.log" Feb 25 16:47:09 crc kubenswrapper[4937]: I0225 16:47:09.416553 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-sw29j_18df78fd-5382-4716-9708-4e669508c898/manager/0.log" Feb 25 16:47:09 crc kubenswrapper[4937]: I0225 16:47:09.618219 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-g82nw_6cb3892f-a950-4dc7-9b9b-0db2876c569d/manager/0.log" Feb 25 
16:47:10 crc kubenswrapper[4937]: I0225 16:47:10.076169 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-ntf28_b688ff11-a838-4c26-90bd-974c871f4d44/manager/0.log" Feb 25 16:47:10 crc kubenswrapper[4937]: I0225 16:47:10.180011 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-bz4hc_23514cd7-1535-4c0a-a090-68c39654dad2/manager/0.log" Feb 25 16:47:10 crc kubenswrapper[4937]: I0225 16:47:10.416336 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-n64lk_42c84a2f-b585-49c5-adb6-fb83ffecef77/manager/0.log" Feb 25 16:47:10 crc kubenswrapper[4937]: I0225 16:47:10.581859 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-vh5zk_4e98f637-2524-43db-9b27-4bd68ae19bf4/manager/0.log" Feb 25 16:47:10 crc kubenswrapper[4937]: I0225 16:47:10.601683 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:10 crc kubenswrapper[4937]: I0225 16:47:10.601755 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:10 crc kubenswrapper[4937]: I0225 16:47:10.655810 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:10 crc kubenswrapper[4937]: I0225 16:47:10.755977 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9_00b4788a-4566-469f-8731-51700725fea0/manager/0.log" Feb 25 16:47:11 crc kubenswrapper[4937]: I0225 16:47:11.212971 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5f5c559654-md6zv_380b8472-bb7f-421e-8a0a-7da8078b6ecc/operator/0.log" Feb 25 16:47:11 crc kubenswrapper[4937]: I0225 16:47:11.518767 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:11 crc kubenswrapper[4937]: I0225 16:47:11.568748 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7hxvd_65bfe4a4-8d7d-48bc-823a-5b388022052f/registry-server/0.log" Feb 25 16:47:11 crc kubenswrapper[4937]: I0225 16:47:11.581373 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cx2ct"] Feb 25 16:47:11 crc kubenswrapper[4937]: I0225 16:47:11.854619 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-vt5mw_e6bcab89-8beb-4879-8596-3a24805bd835/manager/0.log" Feb 25 16:47:11 crc kubenswrapper[4937]: I0225 16:47:11.921459 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-z2bqs_b8448aa3-7cd0-4732-ad80-99fbefc125a6/manager/0.log" Feb 25 16:47:12 crc kubenswrapper[4937]: I0225 16:47:12.213250 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-hmm4d_c21d7933-3e35-48d9-8946-5ffdcc7a42bf/operator/0.log" Feb 25 16:47:12 crc kubenswrapper[4937]: I0225 16:47:12.489019 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-s8x4b_2dde13f7-ba29-4c24-94e0-052d622fe88c/manager/0.log" Feb 25 16:47:12 crc kubenswrapper[4937]: I0225 16:47:12.789535 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-mtqtq_b8a9d073-1b33-4184-8727-28c957c96e5f/manager/0.log" Feb 25 16:47:13 crc kubenswrapper[4937]: I0225 16:47:13.014830 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-pkrhd_5d73c9f2-ead1-410a-ad35-16b7ba251daa/manager/0.log" Feb 25 16:47:13 crc kubenswrapper[4937]: I0225 16:47:13.208629 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-78747bd5c7-dtngf_eefbad00-59b6-4e7c-b056-ba07663a665f/manager/0.log" Feb 25 16:47:13 crc kubenswrapper[4937]: I0225 16:47:13.318158 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-fbcb9db89-8spmv_2007fabb-e6dd-4713-823d-f6a8a3cd41f1/manager/0.log" Feb 25 16:47:13 crc kubenswrapper[4937]: I0225 16:47:13.467903 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cx2ct" podUID="49a53ecf-fa28-4628-bece-dfd41abb0ed6" containerName="registry-server" containerID="cri-o://9fe8711145d4376323b4117466d64051b7b97e852b6202691ff707925615b3cb" gracePeriod=2 Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.100697 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-72hwb_1e2c3857-1279-466f-8da3-ea1f5cf13893/manager/0.log" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.251414 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.419856 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-647gd\" (UniqueName: \"kubernetes.io/projected/49a53ecf-fa28-4628-bece-dfd41abb0ed6-kube-api-access-647gd\") pod \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\" (UID: \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\") " Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.419997 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a53ecf-fa28-4628-bece-dfd41abb0ed6-utilities\") pod \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\" (UID: \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\") " Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.420025 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a53ecf-fa28-4628-bece-dfd41abb0ed6-catalog-content\") pod \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\" (UID: \"49a53ecf-fa28-4628-bece-dfd41abb0ed6\") " Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.424961 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a53ecf-fa28-4628-bece-dfd41abb0ed6-utilities" (OuterVolumeSpecName: "utilities") pod "49a53ecf-fa28-4628-bece-dfd41abb0ed6" (UID: "49a53ecf-fa28-4628-bece-dfd41abb0ed6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.431347 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a53ecf-fa28-4628-bece-dfd41abb0ed6-kube-api-access-647gd" (OuterVolumeSpecName: "kube-api-access-647gd") pod "49a53ecf-fa28-4628-bece-dfd41abb0ed6" (UID: "49a53ecf-fa28-4628-bece-dfd41abb0ed6"). InnerVolumeSpecName "kube-api-access-647gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.476232 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a53ecf-fa28-4628-bece-dfd41abb0ed6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49a53ecf-fa28-4628-bece-dfd41abb0ed6" (UID: "49a53ecf-fa28-4628-bece-dfd41abb0ed6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.479168 4937 generic.go:334] "Generic (PLEG): container finished" podID="49a53ecf-fa28-4628-bece-dfd41abb0ed6" containerID="9fe8711145d4376323b4117466d64051b7b97e852b6202691ff707925615b3cb" exitCode=0 Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.479210 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx2ct" event={"ID":"49a53ecf-fa28-4628-bece-dfd41abb0ed6","Type":"ContainerDied","Data":"9fe8711145d4376323b4117466d64051b7b97e852b6202691ff707925615b3cb"} Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.479237 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx2ct" event={"ID":"49a53ecf-fa28-4628-bece-dfd41abb0ed6","Type":"ContainerDied","Data":"ef5220013fe004db1ecd79a2f38eea9482399b6736dd72d16785a1cd9559c78a"} Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.479253 4937 scope.go:117] "RemoveContainer" containerID="9fe8711145d4376323b4117466d64051b7b97e852b6202691ff707925615b3cb" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.479398 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cx2ct" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.518502 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cx2ct"] Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.523782 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a53ecf-fa28-4628-bece-dfd41abb0ed6-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.523815 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a53ecf-fa28-4628-bece-dfd41abb0ed6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.523826 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-647gd\" (UniqueName: \"kubernetes.io/projected/49a53ecf-fa28-4628-bece-dfd41abb0ed6-kube-api-access-647gd\") on node \"crc\" DevicePath \"\"" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.524619 4937 scope.go:117] "RemoveContainer" containerID="169e45adc56382ea8e00fa69eac42de232a6e8fb28130a21764027894213254e" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.527329 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cx2ct"] Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.565210 4937 scope.go:117] "RemoveContainer" containerID="a0778be5b3c9f9989c619a1554af89870e141431855ee6dd7b7a62726719ac07" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.615934 4937 scope.go:117] "RemoveContainer" containerID="9fe8711145d4376323b4117466d64051b7b97e852b6202691ff707925615b3cb" Feb 25 16:47:14 crc kubenswrapper[4937]: E0225 16:47:14.616321 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fe8711145d4376323b4117466d64051b7b97e852b6202691ff707925615b3cb\": container with ID starting with 9fe8711145d4376323b4117466d64051b7b97e852b6202691ff707925615b3cb not found: ID does not exist" containerID="9fe8711145d4376323b4117466d64051b7b97e852b6202691ff707925615b3cb" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.616351 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fe8711145d4376323b4117466d64051b7b97e852b6202691ff707925615b3cb"} err="failed to get container status \"9fe8711145d4376323b4117466d64051b7b97e852b6202691ff707925615b3cb\": rpc error: code = NotFound desc = could not find container \"9fe8711145d4376323b4117466d64051b7b97e852b6202691ff707925615b3cb\": container with ID starting with 9fe8711145d4376323b4117466d64051b7b97e852b6202691ff707925615b3cb not found: ID does not exist" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.616370 4937 scope.go:117] "RemoveContainer" containerID="169e45adc56382ea8e00fa69eac42de232a6e8fb28130a21764027894213254e" Feb 25 16:47:14 crc kubenswrapper[4937]: E0225 16:47:14.617009 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"169e45adc56382ea8e00fa69eac42de232a6e8fb28130a21764027894213254e\": container with ID starting with 169e45adc56382ea8e00fa69eac42de232a6e8fb28130a21764027894213254e not found: ID does not exist" containerID="169e45adc56382ea8e00fa69eac42de232a6e8fb28130a21764027894213254e" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.617030 4937 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"169e45adc56382ea8e00fa69eac42de232a6e8fb28130a21764027894213254e"} err="failed to get container status \"169e45adc56382ea8e00fa69eac42de232a6e8fb28130a21764027894213254e\": rpc error: code = NotFound desc = could not find container \"169e45adc56382ea8e00fa69eac42de232a6e8fb28130a21764027894213254e\": container with ID starting with 169e45adc56382ea8e00fa69eac42de232a6e8fb28130a21764027894213254e not found: ID does not exist" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.617043 4937 scope.go:117] "RemoveContainer" containerID="a0778be5b3c9f9989c619a1554af89870e141431855ee6dd7b7a62726719ac07" Feb 25 16:47:14 crc kubenswrapper[4937]: E0225 16:47:14.617193 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0778be5b3c9f9989c619a1554af89870e141431855ee6dd7b7a62726719ac07\": container with ID starting with a0778be5b3c9f9989c619a1554af89870e141431855ee6dd7b7a62726719ac07 not found: ID does not exist" containerID="a0778be5b3c9f9989c619a1554af89870e141431855ee6dd7b7a62726719ac07" Feb 25 16:47:14 crc kubenswrapper[4937]: I0225 16:47:14.617212 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0778be5b3c9f9989c619a1554af89870e141431855ee6dd7b7a62726719ac07"} err="failed to get container status \"a0778be5b3c9f9989c619a1554af89870e141431855ee6dd7b7a62726719ac07\": rpc error: code = NotFound desc = could not find container \"a0778be5b3c9f9989c619a1554af89870e141431855ee6dd7b7a62726719ac07\": container with ID starting with a0778be5b3c9f9989c619a1554af89870e141431855ee6dd7b7a62726719ac07 not found: ID does not exist" Feb 25 16:47:15 crc kubenswrapper[4937]: I0225 16:47:15.380911 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a53ecf-fa28-4628-bece-dfd41abb0ed6" path="/var/lib/kubelet/pods/49a53ecf-fa28-4628-bece-dfd41abb0ed6/volumes" Feb 25 16:47:23 crc kubenswrapper[4937]: I0225 16:47:23.864173 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4rckv"] Feb 25 16:47:23 crc kubenswrapper[4937]: E0225 16:47:23.865123 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a53ecf-fa28-4628-bece-dfd41abb0ed6" containerName="extract-content" Feb 25 16:47:23 crc kubenswrapper[4937]: I0225 16:47:23.865136 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a53ecf-fa28-4628-bece-dfd41abb0ed6" containerName="extract-content" Feb 25 16:47:23 crc kubenswrapper[4937]: E0225 16:47:23.865147 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a53ecf-fa28-4628-bece-dfd41abb0ed6" containerName="registry-server" Feb 25 16:47:23 crc kubenswrapper[4937]: I0225 16:47:23.865153 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a53ecf-fa28-4628-bece-dfd41abb0ed6" containerName="registry-server" Feb 25 16:47:23 crc kubenswrapper[4937]: E0225 16:47:23.865188 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a53ecf-fa28-4628-bece-dfd41abb0ed6" containerName="extract-utilities" Feb 25 16:47:23 crc kubenswrapper[4937]: I0225 16:47:23.865194 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a53ecf-fa28-4628-bece-dfd41abb0ed6" containerName="extract-utilities" Feb 25 16:47:23 crc kubenswrapper[4937]: I0225 16:47:23.865375 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a53ecf-fa28-4628-bece-dfd41abb0ed6" 
containerName="registry-server" Feb 25 16:47:23 crc kubenswrapper[4937]: I0225 16:47:23.866895 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:23 crc kubenswrapper[4937]: I0225 16:47:23.891597 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rckv"] Feb 25 16:47:24 crc kubenswrapper[4937]: I0225 16:47:24.033635 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm5r9\" (UniqueName: \"kubernetes.io/projected/712d068b-5e64-41fe-bf7b-839866d10ba9-kube-api-access-gm5r9\") pod \"certified-operators-4rckv\" (UID: \"712d068b-5e64-41fe-bf7b-839866d10ba9\") " pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:24 crc kubenswrapper[4937]: I0225 16:47:24.033713 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712d068b-5e64-41fe-bf7b-839866d10ba9-utilities\") pod \"certified-operators-4rckv\" (UID: \"712d068b-5e64-41fe-bf7b-839866d10ba9\") " pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:24 crc kubenswrapper[4937]: I0225 16:47:24.033824 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712d068b-5e64-41fe-bf7b-839866d10ba9-catalog-content\") pod \"certified-operators-4rckv\" (UID: \"712d068b-5e64-41fe-bf7b-839866d10ba9\") " pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:24 crc kubenswrapper[4937]: I0225 16:47:24.135552 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712d068b-5e64-41fe-bf7b-839866d10ba9-catalog-content\") pod \"certified-operators-4rckv\" (UID: \"712d068b-5e64-41fe-bf7b-839866d10ba9\") " pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:24 crc kubenswrapper[4937]: I0225 16:47:24.135684 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm5r9\" (UniqueName: \"kubernetes.io/projected/712d068b-5e64-41fe-bf7b-839866d10ba9-kube-api-access-gm5r9\") pod \"certified-operators-4rckv\" (UID: \"712d068b-5e64-41fe-bf7b-839866d10ba9\") " pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:24 crc kubenswrapper[4937]: I0225 16:47:24.135726 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712d068b-5e64-41fe-bf7b-839866d10ba9-utilities\") pod \"certified-operators-4rckv\" (UID: \"712d068b-5e64-41fe-bf7b-839866d10ba9\") " pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:24 crc kubenswrapper[4937]: I0225 16:47:24.136091 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712d068b-5e64-41fe-bf7b-839866d10ba9-catalog-content\") pod \"certified-operators-4rckv\" (UID: \"712d068b-5e64-41fe-bf7b-839866d10ba9\") " pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:24 crc kubenswrapper[4937]: I0225 16:47:24.136131 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712d068b-5e64-41fe-bf7b-839866d10ba9-utilities\") pod \"certified-operators-4rckv\" (UID: \"712d068b-5e64-41fe-bf7b-839866d10ba9\") " 
pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:24 crc kubenswrapper[4937]: I0225 16:47:24.153869 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm5r9\" (UniqueName: \"kubernetes.io/projected/712d068b-5e64-41fe-bf7b-839866d10ba9-kube-api-access-gm5r9\") pod \"certified-operators-4rckv\" (UID: \"712d068b-5e64-41fe-bf7b-839866d10ba9\") " pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:24 crc kubenswrapper[4937]: I0225 16:47:24.183186 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:24 crc kubenswrapper[4937]: I0225 16:47:24.653618 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rckv"] Feb 25 16:47:25 crc kubenswrapper[4937]: I0225 16:47:25.607058 4937 generic.go:334] "Generic (PLEG): container finished" podID="712d068b-5e64-41fe-bf7b-839866d10ba9" containerID="a1406d5dc6942c0e9fac03b88f63666da8fd0b82a473dc1c2220175587fc932f" exitCode=0 Feb 25 16:47:25 crc kubenswrapper[4937]: I0225 16:47:25.607418 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rckv" event={"ID":"712d068b-5e64-41fe-bf7b-839866d10ba9","Type":"ContainerDied","Data":"a1406d5dc6942c0e9fac03b88f63666da8fd0b82a473dc1c2220175587fc932f"} Feb 25 16:47:25 crc kubenswrapper[4937]: I0225 16:47:25.607444 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rckv" event={"ID":"712d068b-5e64-41fe-bf7b-839866d10ba9","Type":"ContainerStarted","Data":"8563d974b9c039a3ed6ef87faa25a0da5a51856f6112db498b48791e01001bdd"} Feb 25 16:47:32 crc kubenswrapper[4937]: I0225 16:47:32.679250 4937 generic.go:334] "Generic (PLEG): container finished" podID="712d068b-5e64-41fe-bf7b-839866d10ba9" containerID="d8382276359df5faef5d3a88f2371f782f331cba717be1b2b7e425a66292c8fd" exitCode=0 Feb 25 16:47:32 crc kubenswrapper[4937]: I0225 16:47:32.679347 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rckv" event={"ID":"712d068b-5e64-41fe-bf7b-839866d10ba9","Type":"ContainerDied","Data":"d8382276359df5faef5d3a88f2371f782f331cba717be1b2b7e425a66292c8fd"} Feb 25 16:47:33 crc kubenswrapper[4937]: I0225 16:47:33.690347 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rckv" event={"ID":"712d068b-5e64-41fe-bf7b-839866d10ba9","Type":"ContainerStarted","Data":"d8e101dee576e23dea27d919f83a0376bad04f4c692c39b51d7c0fd65b4c3e36"} Feb 25 16:47:33 crc kubenswrapper[4937]: I0225 16:47:33.717786 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4rckv" podStartSLOduration=3.167589698 podStartE2EDuration="10.717770884s" podCreationTimestamp="2026-02-25 16:47:23 +0000 UTC" firstStartedPulling="2026-02-25 16:47:25.60952927 +0000 UTC m=+3696.622921160" lastFinishedPulling="2026-02-25 16:47:33.159710456 +0000 UTC m=+3704.173102346" observedRunningTime="2026-02-25 16:47:33.714376929 +0000 UTC m=+3704.727768819" watchObservedRunningTime="2026-02-25 16:47:33.717770884 +0000 UTC m=+3704.731162774" Feb 25 16:47:34 crc kubenswrapper[4937]: I0225 16:47:34.184072 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:34 crc kubenswrapper[4937]: I0225 16:47:34.184627 4937 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:35 crc kubenswrapper[4937]: I0225 16:47:35.244974 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-4rckv" podUID="712d068b-5e64-41fe-bf7b-839866d10ba9" containerName="registry-server" probeResult="failure" output=< Feb 25 16:47:35 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 16:47:35 crc kubenswrapper[4937]: > Feb 25 16:47:38 crc kubenswrapper[4937]: I0225 16:47:38.018701 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dz785_9b1dc13b-9b02-42b0-a00e-21f15f9f98a2/control-plane-machine-set-operator/1.log" Feb 25 16:47:38 crc kubenswrapper[4937]: I0225 16:47:38.080016 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dz785_9b1dc13b-9b02-42b0-a00e-21f15f9f98a2/control-plane-machine-set-operator/0.log" Feb 25 16:47:38 crc kubenswrapper[4937]: I0225 16:47:38.289021 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8zn9j_7210df16-765e-4b49-8b67-8989f4b2f15c/kube-rbac-proxy/0.log" Feb 25 16:47:38 crc kubenswrapper[4937]: I0225 16:47:38.346173 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8zn9j_7210df16-765e-4b49-8b67-8989f4b2f15c/machine-api-operator/0.log" Feb 25 16:47:41 crc kubenswrapper[4937]: I0225 16:47:41.494949 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:47:41 crc kubenswrapper[4937]: I0225 16:47:41.495477 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:47:44 crc kubenswrapper[4937]: I0225 16:47:44.248518 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:44 crc kubenswrapper[4937]: I0225 16:47:44.337460 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4rckv" Feb 25 16:47:44 crc kubenswrapper[4937]: I0225 16:47:44.427946 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rckv"] Feb 25 16:47:44 crc kubenswrapper[4937]: I0225 16:47:44.670269 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r5xz8"] Feb 25 16:47:44 crc kubenswrapper[4937]: I0225 16:47:44.670522 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r5xz8" podUID="e5b80aa7-9a93-4c10-84a8-d6d12889d28e" containerName="registry-server" containerID="cri-o://c914b2f1783e89c419934bb732fb23fd3941ba6cd1066ed4169df2eb8aa2122d" gracePeriod=2 Feb 25 16:47:44 crc kubenswrapper[4937]: I0225 16:47:44.858270 4937 generic.go:334] "Generic (PLEG): container finished" 
podID="e5b80aa7-9a93-4c10-84a8-d6d12889d28e" containerID="c914b2f1783e89c419934bb732fb23fd3941ba6cd1066ed4169df2eb8aa2122d" exitCode=0 Feb 25 16:47:44 crc kubenswrapper[4937]: I0225 16:47:44.858384 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5xz8" event={"ID":"e5b80aa7-9a93-4c10-84a8-d6d12889d28e","Type":"ContainerDied","Data":"c914b2f1783e89c419934bb732fb23fd3941ba6cd1066ed4169df2eb8aa2122d"} Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.428401 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.520320 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-utilities\") pod \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\" (UID: \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\") " Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.520713 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clqxf\" (UniqueName: \"kubernetes.io/projected/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-kube-api-access-clqxf\") pod \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\" (UID: \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\") " Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.520875 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-catalog-content\") pod \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\" (UID: \"e5b80aa7-9a93-4c10-84a8-d6d12889d28e\") " Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.520960 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-utilities" (OuterVolumeSpecName: "utilities") pod "e5b80aa7-9a93-4c10-84a8-d6d12889d28e" (UID: "e5b80aa7-9a93-4c10-84a8-d6d12889d28e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.522031 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.542725 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-kube-api-access-clqxf" (OuterVolumeSpecName: "kube-api-access-clqxf") pod "e5b80aa7-9a93-4c10-84a8-d6d12889d28e" (UID: "e5b80aa7-9a93-4c10-84a8-d6d12889d28e"). InnerVolumeSpecName "kube-api-access-clqxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.569988 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5b80aa7-9a93-4c10-84a8-d6d12889d28e" (UID: "e5b80aa7-9a93-4c10-84a8-d6d12889d28e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.624586 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.624616 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clqxf\" (UniqueName: \"kubernetes.io/projected/e5b80aa7-9a93-4c10-84a8-d6d12889d28e-kube-api-access-clqxf\") on node \"crc\" DevicePath \"\"" Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.870307 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r5xz8" Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.870301 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5xz8" event={"ID":"e5b80aa7-9a93-4c10-84a8-d6d12889d28e","Type":"ContainerDied","Data":"3cc4afaca4045f4240fe26acbcb04afa8cc7f5230a4b484afa08c0c3719dafb7"} Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.870382 4937 scope.go:117] "RemoveContainer" containerID="c914b2f1783e89c419934bb732fb23fd3941ba6cd1066ed4169df2eb8aa2122d" Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.896682 4937 scope.go:117] "RemoveContainer" containerID="df2fe4ef78694f38b225e031c16c88df880fe1419da34db3a6936946f536ec09" Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.925506 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r5xz8"] Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.925642 4937 scope.go:117] "RemoveContainer" containerID="1cdbbf4008de41aaa4a6ebff22164e5cd940f1a39a6abd791a56173ca61178da" Feb 25 16:47:45 crc kubenswrapper[4937]: I0225 16:47:45.941983 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r5xz8"] Feb 25 16:47:47 crc kubenswrapper[4937]: I0225 16:47:47.378170 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5b80aa7-9a93-4c10-84a8-d6d12889d28e" path="/var/lib/kubelet/pods/e5b80aa7-9a93-4c10-84a8-d6d12889d28e/volumes" Feb 25 16:47:54 crc kubenswrapper[4937]: I0225 16:47:54.389638 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-tfn2h_92b2442a-04d9-4377-bef2-958d8a72543f/cert-manager-controller/0.log" Feb 25 16:47:54 crc kubenswrapper[4937]: I0225 16:47:54.586756 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-p4vff_99c5f86d-7755-49b0-bb68-7e9a338dbca7/cert-manager-cainjector/0.log" Feb 25 16:47:54 crc kubenswrapper[4937]: I0225 16:47:54.710299 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-m4kc4_2e84eec9-8ff5-4f02-9596-e468e289dba0/cert-manager-webhook/0.log" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.167007 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533968-z95bx"] Feb 25 16:48:00 crc kubenswrapper[4937]: E0225 16:48:00.167937 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b80aa7-9a93-4c10-84a8-d6d12889d28e" containerName="extract-utilities" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.167955 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b80aa7-9a93-4c10-84a8-d6d12889d28e" 
containerName="extract-utilities" Feb 25 16:48:00 crc kubenswrapper[4937]: E0225 16:48:00.167986 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b80aa7-9a93-4c10-84a8-d6d12889d28e" containerName="extract-content" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.167995 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b80aa7-9a93-4c10-84a8-d6d12889d28e" containerName="extract-content" Feb 25 16:48:00 crc kubenswrapper[4937]: E0225 16:48:00.168011 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b80aa7-9a93-4c10-84a8-d6d12889d28e" containerName="registry-server" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.168020 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b80aa7-9a93-4c10-84a8-d6d12889d28e" containerName="registry-server" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.168274 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5b80aa7-9a93-4c10-84a8-d6d12889d28e" containerName="registry-server" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.168990 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533968-z95bx" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.173940 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.176252 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.178013 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.190859 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533968-z95bx"] Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.326286 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tschr\" (UniqueName: \"kubernetes.io/projected/4d1be3d1-6300-4b05-817c-9681567a5f8d-kube-api-access-tschr\") pod \"auto-csr-approver-29533968-z95bx\" (UID: \"4d1be3d1-6300-4b05-817c-9681567a5f8d\") " pod="openshift-infra/auto-csr-approver-29533968-z95bx" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.428293 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tschr\" (UniqueName: \"kubernetes.io/projected/4d1be3d1-6300-4b05-817c-9681567a5f8d-kube-api-access-tschr\") pod \"auto-csr-approver-29533968-z95bx\" (UID: \"4d1be3d1-6300-4b05-817c-9681567a5f8d\") " pod="openshift-infra/auto-csr-approver-29533968-z95bx" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.451530 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tschr\" (UniqueName: \"kubernetes.io/projected/4d1be3d1-6300-4b05-817c-9681567a5f8d-kube-api-access-tschr\") pod \"auto-csr-approver-29533968-z95bx\" (UID: \"4d1be3d1-6300-4b05-817c-9681567a5f8d\") " pod="openshift-infra/auto-csr-approver-29533968-z95bx" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.485934 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533968-z95bx" Feb 25 16:48:00 crc kubenswrapper[4937]: I0225 16:48:00.962990 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533968-z95bx"] Feb 25 16:48:01 crc kubenswrapper[4937]: I0225 16:48:01.025672 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533968-z95bx" event={"ID":"4d1be3d1-6300-4b05-817c-9681567a5f8d","Type":"ContainerStarted","Data":"ba8d9d95326ad076c66a07a4109be1514a991ae674867c5e5cc5da87fd57f014"} Feb 25 16:48:03 crc kubenswrapper[4937]: I0225 16:48:03.048118 4937 generic.go:334] "Generic (PLEG): container finished" podID="4d1be3d1-6300-4b05-817c-9681567a5f8d" containerID="928baeafa4e24ae587779daec1d9e7ea0e35351405dd46039e2ce6a0bb05cd46" exitCode=0 Feb 25 16:48:03 crc kubenswrapper[4937]: I0225 16:48:03.048225 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533968-z95bx" event={"ID":"4d1be3d1-6300-4b05-817c-9681567a5f8d","Type":"ContainerDied","Data":"928baeafa4e24ae587779daec1d9e7ea0e35351405dd46039e2ce6a0bb05cd46"} Feb 25 16:48:04 crc kubenswrapper[4937]: I0225 16:48:04.658604 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533968-z95bx" Feb 25 16:48:04 crc kubenswrapper[4937]: I0225 16:48:04.812781 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tschr\" (UniqueName: \"kubernetes.io/projected/4d1be3d1-6300-4b05-817c-9681567a5f8d-kube-api-access-tschr\") pod \"4d1be3d1-6300-4b05-817c-9681567a5f8d\" (UID: \"4d1be3d1-6300-4b05-817c-9681567a5f8d\") " Feb 25 16:48:04 crc kubenswrapper[4937]: I0225 16:48:04.822799 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d1be3d1-6300-4b05-817c-9681567a5f8d-kube-api-access-tschr" (OuterVolumeSpecName: "kube-api-access-tschr") pod "4d1be3d1-6300-4b05-817c-9681567a5f8d" (UID: "4d1be3d1-6300-4b05-817c-9681567a5f8d"). InnerVolumeSpecName "kube-api-access-tschr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:48:04 crc kubenswrapper[4937]: I0225 16:48:04.938551 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tschr\" (UniqueName: \"kubernetes.io/projected/4d1be3d1-6300-4b05-817c-9681567a5f8d-kube-api-access-tschr\") on node \"crc\" DevicePath \"\"" Feb 25 16:48:05 crc kubenswrapper[4937]: I0225 16:48:05.064130 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533968-z95bx" event={"ID":"4d1be3d1-6300-4b05-817c-9681567a5f8d","Type":"ContainerDied","Data":"ba8d9d95326ad076c66a07a4109be1514a991ae674867c5e5cc5da87fd57f014"} Feb 25 16:48:05 crc kubenswrapper[4937]: I0225 16:48:05.064170 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba8d9d95326ad076c66a07a4109be1514a991ae674867c5e5cc5da87fd57f014" Feb 25 16:48:05 crc kubenswrapper[4937]: I0225 16:48:05.064218 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533968-z95bx" Feb 25 16:48:05 crc kubenswrapper[4937]: I0225 16:48:05.736178 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533962-szqn9"] Feb 25 16:48:05 crc kubenswrapper[4937]: I0225 16:48:05.748017 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533962-szqn9"] Feb 25 16:48:07 crc kubenswrapper[4937]: I0225 16:48:07.379409 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e245e2-fcd1-419a-b30f-e248e047cae5" path="/var/lib/kubelet/pods/90e245e2-fcd1-419a-b30f-e248e047cae5/volumes" Feb 25 16:48:10 crc kubenswrapper[4937]: I0225 16:48:10.091947 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-xdc54_ac54500d-8e21-4b21-bb07-9ac1daf6ad08/nmstate-console-plugin/0.log" Feb 25 16:48:10 crc kubenswrapper[4937]: I0225 16:48:10.233555 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-hb5qm_3c5b69b1-26a3-4de2-9d56-ffc97c64ddad/nmstate-handler/0.log" Feb 25 16:48:10 crc kubenswrapper[4937]: I0225 16:48:10.283965 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-gs4df_6ce93581-d0da-4acc-978d-4c7b936d736b/kube-rbac-proxy/0.log" Feb 25 16:48:10 crc kubenswrapper[4937]: I0225 16:48:10.332617 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-gs4df_6ce93581-d0da-4acc-978d-4c7b936d736b/nmstate-metrics/0.log" Feb 25 16:48:10 crc kubenswrapper[4937]: I0225 16:48:10.475219 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-c4wvk_086feb17-6360-4d8f-a766-78607300c491/nmstate-operator/0.log" Feb 25 16:48:10 crc kubenswrapper[4937]: I0225 16:48:10.541229 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-ljnzj_fee0f5ff-b02d-4a31-921b-e151949932d1/nmstate-webhook/0.log" Feb 25 16:48:11 crc kubenswrapper[4937]: I0225 16:48:11.494716 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:48:11 crc kubenswrapper[4937]: I0225 16:48:11.494780 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:48:25 crc kubenswrapper[4937]: I0225 16:48:25.571421 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-b5f46f5f7-zscl6_b72cc98b-e045-4ade-bdf7-c9929fc489fc/manager/0.log" Feb 25 16:48:25 crc kubenswrapper[4937]: I0225 16:48:25.581255 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-b5f46f5f7-zscl6_b72cc98b-e045-4ade-bdf7-c9929fc489fc/kube-rbac-proxy/0.log" Feb 25 16:48:39 crc kubenswrapper[4937]: I0225 16:48:39.858587 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-46sv9_4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af/prometheus-operator/0.log" Feb 25 16:48:40 crc kubenswrapper[4937]: I0225 16:48:40.032011 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_8f8315c1-97ca-4525-a1a8-afe98581f614/prometheus-operator-admission-webhook/0.log" Feb 25 16:48:40 crc kubenswrapper[4937]: I0225 16:48:40.084838 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_1cad8e2a-5182-4d59-9afa-c64ced98e87b/prometheus-operator-admission-webhook/0.log" Feb 25 16:48:40 crc kubenswrapper[4937]: I0225 16:48:40.312080 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-p5kbh_0eb822b0-826a-4b2d-9376-141a69ba37e5/operator/0.log" Feb 25 16:48:40 crc kubenswrapper[4937]: I0225 16:48:40.362914 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-prw69_26437cd5-3ce5-4d7a-9b7f-9f983015f74d/perses-operator/0.log" Feb 25 16:48:41 crc kubenswrapper[4937]: I0225 16:48:41.494843 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:48:41 crc kubenswrapper[4937]: I0225 16:48:41.495201 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:48:41 crc kubenswrapper[4937]: I0225 16:48:41.495249 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 16:48:41 crc kubenswrapper[4937]: I0225 16:48:41.496034 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 16:48:41 crc kubenswrapper[4937]: I0225 16:48:41.496087 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" gracePeriod=600 Feb 25 16:48:41 crc kubenswrapper[4937]: E0225 16:48:41.625945 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:48:42 crc kubenswrapper[4937]: I0225 16:48:42.461687 4937 
generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" exitCode=0 Feb 25 16:48:42 crc kubenswrapper[4937]: I0225 16:48:42.461723 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd"} Feb 25 16:48:42 crc kubenswrapper[4937]: I0225 16:48:42.461990 4937 scope.go:117] "RemoveContainer" containerID="c523637845c15bc142c970843436a3634d3a2d1727208c0649d730f119f41f73" Feb 25 16:48:42 crc kubenswrapper[4937]: I0225 16:48:42.462599 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:48:42 crc kubenswrapper[4937]: E0225 16:48:42.462851 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:48:43 crc kubenswrapper[4937]: I0225 16:48:43.686437 4937 scope.go:117] "RemoveContainer" containerID="900bd544d6d636b7f5f1e4be8069ace7ca5db0366b511643af6129cb65a82a63" Feb 25 16:48:55 crc kubenswrapper[4937]: I0225 16:48:55.863931 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-5vzl9_53cf6067-7864-4449-9f64-2cf8181fec1d/kube-rbac-proxy/0.log" Feb 25 16:48:55 crc kubenswrapper[4937]: I0225 16:48:55.963905 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-5vzl9_53cf6067-7864-4449-9f64-2cf8181fec1d/controller/0.log" Feb 25 16:48:56 crc kubenswrapper[4937]: I0225 16:48:56.093558 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-frr-files/0.log" Feb 25 16:48:56 crc kubenswrapper[4937]: I0225 16:48:56.276797 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-reloader/0.log" Feb 25 16:48:56 crc kubenswrapper[4937]: I0225 16:48:56.308153 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-metrics/0.log" Feb 25 16:48:56 crc kubenswrapper[4937]: I0225 16:48:56.340536 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-frr-files/0.log" Feb 25 16:48:56 crc kubenswrapper[4937]: I0225 16:48:56.356478 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-reloader/0.log" Feb 25 16:48:56 crc kubenswrapper[4937]: I0225 16:48:56.490977 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-reloader/0.log" Feb 25 16:48:56 crc kubenswrapper[4937]: I0225 16:48:56.529380 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-frr-files/0.log" Feb 25 16:48:56 crc kubenswrapper[4937]: I0225 
16:48:56.572004 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-metrics/0.log" Feb 25 16:48:56 crc kubenswrapper[4937]: I0225 16:48:56.649552 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-metrics/0.log" Feb 25 16:48:56 crc kubenswrapper[4937]: I0225 16:48:56.922156 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-reloader/0.log" Feb 25 16:48:56 crc kubenswrapper[4937]: I0225 16:48:56.933153 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-frr-files/0.log" Feb 25 16:48:56 crc kubenswrapper[4937]: I0225 16:48:56.962777 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-metrics/0.log" Feb 25 16:48:56 crc kubenswrapper[4937]: I0225 16:48:56.975716 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/controller/0.log" Feb 25 16:48:57 crc kubenswrapper[4937]: I0225 16:48:57.132598 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/frr-metrics/0.log" Feb 25 16:48:57 crc kubenswrapper[4937]: I0225 16:48:57.213932 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/kube-rbac-proxy-frr/0.log" Feb 25 16:48:57 crc kubenswrapper[4937]: I0225 16:48:57.262248 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/kube-rbac-proxy/0.log" Feb 25 16:48:57 crc kubenswrapper[4937]: I0225 16:48:57.330507 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/reloader/0.log" Feb 25 16:48:57 crc kubenswrapper[4937]: I0225 16:48:57.496345 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-zl6xj_f3b9485a-9a4f-467b-9e99-e858b7b47a8b/frr-k8s-webhook-server/0.log" Feb 25 16:48:57 crc kubenswrapper[4937]: I0225 16:48:57.697385 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-678f5df958-zlttq_8ad5751a-e32c-4f13-ab06-b3ddeb681961/manager/0.log" Feb 25 16:48:57 crc kubenswrapper[4937]: I0225 16:48:57.829330 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-59546f7477-2w52w_6faaefa8-4269-448f-90a9-b4af7b5b2eae/webhook-server/0.log" Feb 25 16:48:58 crc kubenswrapper[4937]: I0225 16:48:58.144909 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vpqx7_8c24e8b5-c791-4ceb-9258-fba04c4adf91/kube-rbac-proxy/0.log" Feb 25 16:48:58 crc kubenswrapper[4937]: I0225 16:48:58.368658 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:48:58 crc kubenswrapper[4937]: E0225 16:48:58.368875 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:48:58 crc kubenswrapper[4937]: I0225 16:48:58.567848 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/frr/0.log" Feb 25 16:48:58 crc kubenswrapper[4937]: I0225 16:48:58.602267 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vpqx7_8c24e8b5-c791-4ceb-9258-fba04c4adf91/speaker/0.log" Feb 25 16:49:12 crc kubenswrapper[4937]: I0225 16:49:12.734212 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/util/0.log" Feb 25 16:49:12 crc kubenswrapper[4937]: I0225 16:49:12.963496 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/util/0.log" Feb 25 16:49:12 crc kubenswrapper[4937]: I0225 16:49:12.993830 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/pull/0.log" Feb 25 16:49:12 crc kubenswrapper[4937]: I0225 16:49:12.995153 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/pull/0.log" Feb 25 16:49:13 crc kubenswrapper[4937]: I0225 16:49:13.210120 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/pull/0.log" Feb 25 16:49:13 crc kubenswrapper[4937]: I0225 16:49:13.217228 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/util/0.log" Feb 25 16:49:13 crc kubenswrapper[4937]: I0225 16:49:13.224933 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/extract/0.log" Feb 25 16:49:13 crc kubenswrapper[4937]: I0225 16:49:13.365824 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/util/0.log" Feb 25 16:49:13 crc kubenswrapper[4937]: I0225 16:49:13.368446 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:49:13 crc kubenswrapper[4937]: E0225 16:49:13.368789 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:49:13 crc kubenswrapper[4937]: I0225 16:49:13.565578 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/util/0.log" Feb 25 16:49:13 crc kubenswrapper[4937]: I0225 16:49:13.612443 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/pull/0.log" Feb 25 16:49:13 crc kubenswrapper[4937]: I0225 16:49:13.635918 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/pull/0.log" Feb 25 16:49:13 crc kubenswrapper[4937]: I0225 16:49:13.810979 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/util/0.log" Feb 25 16:49:13 crc kubenswrapper[4937]: I0225 16:49:13.816723 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/pull/0.log" Feb 25 16:49:13 crc kubenswrapper[4937]: I0225 16:49:13.860965 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/extract/0.log" Feb 25 16:49:13 crc kubenswrapper[4937]: I0225 16:49:13.981327 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/util/0.log" Feb 25 16:49:14 crc kubenswrapper[4937]: I0225 16:49:14.201462 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/pull/0.log" Feb 25 16:49:14 crc kubenswrapper[4937]: I0225 16:49:14.202863 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/util/0.log" Feb 25 16:49:14 crc kubenswrapper[4937]: I0225 16:49:14.203590 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/pull/0.log" Feb 25 16:49:14 crc kubenswrapper[4937]: I0225 16:49:14.425313 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/util/0.log" Feb 25 16:49:14 crc kubenswrapper[4937]: I0225 16:49:14.426816 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/pull/0.log" Feb 25 16:49:14 crc kubenswrapper[4937]: I0225 16:49:14.488307 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/extract/0.log" Feb 25 16:49:14 crc kubenswrapper[4937]: I0225 16:49:14.632079 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/extract-utilities/0.log" Feb 25 
16:49:14 crc kubenswrapper[4937]: I0225 16:49:14.775702 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/extract-utilities/0.log" Feb 25 16:49:14 crc kubenswrapper[4937]: I0225 16:49:14.810653 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/extract-content/0.log" Feb 25 16:49:14 crc kubenswrapper[4937]: I0225 16:49:14.867445 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/extract-content/0.log" Feb 25 16:49:15 crc kubenswrapper[4937]: I0225 16:49:15.092498 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/extract-content/0.log" Feb 25 16:49:15 crc kubenswrapper[4937]: I0225 16:49:15.124293 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/extract-utilities/0.log" Feb 25 16:49:15 crc kubenswrapper[4937]: I0225 16:49:15.262373 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/registry-server/0.log" Feb 25 16:49:15 crc kubenswrapper[4937]: I0225 16:49:15.320228 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/extract-utilities/0.log" Feb 25 16:49:15 crc kubenswrapper[4937]: I0225 16:49:15.500057 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/extract-utilities/0.log" Feb 25 16:49:15 crc kubenswrapper[4937]: I0225 16:49:15.520427 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/extract-content/0.log" Feb 25 16:49:15 crc kubenswrapper[4937]: I0225 16:49:15.543176 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/extract-content/0.log" Feb 25 16:49:15 crc kubenswrapper[4937]: I0225 16:49:15.714077 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/extract-content/0.log" Feb 25 16:49:15 crc kubenswrapper[4937]: I0225 16:49:15.789531 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/extract-utilities/0.log" Feb 25 16:49:16 crc kubenswrapper[4937]: I0225 16:49:16.076972 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/util/0.log" Feb 25 16:49:16 crc kubenswrapper[4937]: I0225 16:49:16.312335 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/registry-server/0.log" Feb 25 16:49:16 crc kubenswrapper[4937]: I0225 16:49:16.328336 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/util/0.log" Feb 25 16:49:16 crc kubenswrapper[4937]: I0225 16:49:16.348739 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/pull/0.log" Feb 25 16:49:16 crc kubenswrapper[4937]: I0225 16:49:16.370193 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/pull/0.log" Feb 25 16:49:16 crc kubenswrapper[4937]: I0225 16:49:16.546312 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/pull/0.log" Feb 25 16:49:16 crc kubenswrapper[4937]: I0225 16:49:16.580962 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/util/0.log" Feb 25 16:49:16 crc kubenswrapper[4937]: I0225 16:49:16.615057 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/extract/0.log" Feb 25 16:49:16 crc kubenswrapper[4937]: I0225 16:49:16.631586 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nbj4m_44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6/marketplace-operator/0.log" Feb 25 16:49:16 crc kubenswrapper[4937]: I0225 16:49:16.736926 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/extract-utilities/0.log" Feb 25 16:49:16 crc kubenswrapper[4937]: I0225 16:49:16.993604 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/extract-content/0.log" Feb 25 16:49:17 crc kubenswrapper[4937]: I0225 16:49:17.006889 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/extract-content/0.log" Feb 25 16:49:17 crc kubenswrapper[4937]: I0225 16:49:17.030579 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/extract-utilities/0.log" Feb 25 16:49:17 crc kubenswrapper[4937]: I0225 16:49:17.225187 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/extract-content/0.log" Feb 25 16:49:17 crc kubenswrapper[4937]: I0225 16:49:17.273395 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/extract-utilities/0.log" Feb 25 16:49:17 crc kubenswrapper[4937]: I0225 16:49:17.280330 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/extract-utilities/0.log" Feb 25 16:49:17 crc kubenswrapper[4937]: I0225 16:49:17.346975 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/registry-server/0.log" Feb 25 16:49:17 crc kubenswrapper[4937]: I0225 16:49:17.553917 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/extract-content/0.log" Feb 25 16:49:17 crc kubenswrapper[4937]: I0225 16:49:17.572481 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/extract-utilities/0.log" Feb 25 16:49:17 crc kubenswrapper[4937]: I0225 16:49:17.572534 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/extract-content/0.log" Feb 25 16:49:17 crc kubenswrapper[4937]: I0225 16:49:17.741092 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/extract-utilities/0.log" Feb 25 16:49:17 crc kubenswrapper[4937]: I0225 16:49:17.767099 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/extract-content/0.log" Feb 25 16:49:18 crc kubenswrapper[4937]: I0225 16:49:18.206230 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/registry-server/0.log" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.368497 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:49:26 crc kubenswrapper[4937]: E0225 16:49:26.369344 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.667527 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-stnj6"] Feb 25 16:49:26 crc kubenswrapper[4937]: E0225 16:49:26.668111 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1be3d1-6300-4b05-817c-9681567a5f8d" containerName="oc" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.668128 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1be3d1-6300-4b05-817c-9681567a5f8d" containerName="oc" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.668335 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1be3d1-6300-4b05-817c-9681567a5f8d" containerName="oc" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.669854 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.693676 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-stnj6"] Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.823248 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5766fa18-7a2c-44b1-97e2-eafce2097b00-catalog-content\") pod \"redhat-marketplace-stnj6\" (UID: \"5766fa18-7a2c-44b1-97e2-eafce2097b00\") " pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.823382 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5766fa18-7a2c-44b1-97e2-eafce2097b00-utilities\") pod \"redhat-marketplace-stnj6\" (UID: \"5766fa18-7a2c-44b1-97e2-eafce2097b00\") " pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.823411 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvg97\" (UniqueName: \"kubernetes.io/projected/5766fa18-7a2c-44b1-97e2-eafce2097b00-kube-api-access-gvg97\") pod \"redhat-marketplace-stnj6\" (UID: \"5766fa18-7a2c-44b1-97e2-eafce2097b00\") " pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.927272 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5766fa18-7a2c-44b1-97e2-eafce2097b00-catalog-content\") pod \"redhat-marketplace-stnj6\" (UID: \"5766fa18-7a2c-44b1-97e2-eafce2097b00\") " pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.927860 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5766fa18-7a2c-44b1-97e2-eafce2097b00-catalog-content\") pod \"redhat-marketplace-stnj6\" (UID: \"5766fa18-7a2c-44b1-97e2-eafce2097b00\") " pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.927970 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5766fa18-7a2c-44b1-97e2-eafce2097b00-utilities\") pod \"redhat-marketplace-stnj6\" (UID: \"5766fa18-7a2c-44b1-97e2-eafce2097b00\") " pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.928006 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5766fa18-7a2c-44b1-97e2-eafce2097b00-utilities\") pod \"redhat-marketplace-stnj6\" (UID: \"5766fa18-7a2c-44b1-97e2-eafce2097b00\") " pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.928043 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvg97\" (UniqueName: \"kubernetes.io/projected/5766fa18-7a2c-44b1-97e2-eafce2097b00-kube-api-access-gvg97\") pod \"redhat-marketplace-stnj6\" (UID: \"5766fa18-7a2c-44b1-97e2-eafce2097b00\") " pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:26 crc kubenswrapper[4937]: I0225 16:49:26.949378 4937 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-gvg97\" (UniqueName: \"kubernetes.io/projected/5766fa18-7a2c-44b1-97e2-eafce2097b00-kube-api-access-gvg97\") pod \"redhat-marketplace-stnj6\" (UID: \"5766fa18-7a2c-44b1-97e2-eafce2097b00\") " pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:27 crc kubenswrapper[4937]: I0225 16:49:27.000718 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:27 crc kubenswrapper[4937]: I0225 16:49:27.552794 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-stnj6"] Feb 25 16:49:27 crc kubenswrapper[4937]: I0225 16:49:27.925782 4937 generic.go:334] "Generic (PLEG): container finished" podID="5766fa18-7a2c-44b1-97e2-eafce2097b00" containerID="494f7d7e71bde9a155116100e01f3fd306828caac74a4b0bab70ee74e9bd225e" exitCode=0 Feb 25 16:49:27 crc kubenswrapper[4937]: I0225 16:49:27.925829 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stnj6" event={"ID":"5766fa18-7a2c-44b1-97e2-eafce2097b00","Type":"ContainerDied","Data":"494f7d7e71bde9a155116100e01f3fd306828caac74a4b0bab70ee74e9bd225e"} Feb 25 16:49:27 crc kubenswrapper[4937]: I0225 16:49:27.925854 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stnj6" event={"ID":"5766fa18-7a2c-44b1-97e2-eafce2097b00","Type":"ContainerStarted","Data":"67b79297492c891f88807a38bb63772ba712256106f47b226f6ab83321c519a3"} Feb 25 16:49:27 crc kubenswrapper[4937]: I0225 16:49:27.928198 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 16:49:28 crc kubenswrapper[4937]: I0225 16:49:28.937084 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stnj6" event={"ID":"5766fa18-7a2c-44b1-97e2-eafce2097b00","Type":"ContainerStarted","Data":"5fe2104d9328e1920cc1f5eae69cd34ae38610fbe07e136fe9297eaad71f837a"} Feb 25 16:49:29 crc kubenswrapper[4937]: I0225 16:49:29.949051 4937 generic.go:334] "Generic (PLEG): container finished" podID="5766fa18-7a2c-44b1-97e2-eafce2097b00" containerID="5fe2104d9328e1920cc1f5eae69cd34ae38610fbe07e136fe9297eaad71f837a" exitCode=0 Feb 25 16:49:29 crc kubenswrapper[4937]: I0225 16:49:29.949171 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stnj6" event={"ID":"5766fa18-7a2c-44b1-97e2-eafce2097b00","Type":"ContainerDied","Data":"5fe2104d9328e1920cc1f5eae69cd34ae38610fbe07e136fe9297eaad71f837a"} Feb 25 16:49:30 crc kubenswrapper[4937]: I0225 16:49:30.960546 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stnj6" event={"ID":"5766fa18-7a2c-44b1-97e2-eafce2097b00","Type":"ContainerStarted","Data":"ecf760f9100e981bbcc99ce4953a8e0b825ba896bdaa4d46601dc99223a69f03"} Feb 25 16:49:30 crc kubenswrapper[4937]: I0225 16:49:30.991787 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-stnj6" podStartSLOduration=2.588851801 podStartE2EDuration="4.991764786s" podCreationTimestamp="2026-02-25 16:49:26 +0000 UTC" firstStartedPulling="2026-02-25 16:49:27.927794302 +0000 UTC m=+3818.941186212" lastFinishedPulling="2026-02-25 16:49:30.330707307 +0000 UTC m=+3821.344099197" observedRunningTime="2026-02-25 16:49:30.984242678 +0000 UTC m=+3821.997634568" watchObservedRunningTime="2026-02-25 16:49:30.991764786 +0000 UTC 
m=+3822.005156676" Feb 25 16:49:31 crc kubenswrapper[4937]: I0225 16:49:31.959416 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_8f8315c1-97ca-4525-a1a8-afe98581f614/prometheus-operator-admission-webhook/0.log" Feb 25 16:49:31 crc kubenswrapper[4937]: I0225 16:49:31.974290 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_1cad8e2a-5182-4d59-9afa-c64ced98e87b/prometheus-operator-admission-webhook/0.log" Feb 25 16:49:32 crc kubenswrapper[4937]: I0225 16:49:32.016049 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-46sv9_4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af/prometheus-operator/0.log" Feb 25 16:49:32 crc kubenswrapper[4937]: I0225 16:49:32.214022 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-prw69_26437cd5-3ce5-4d7a-9b7f-9f983015f74d/perses-operator/0.log" Feb 25 16:49:32 crc kubenswrapper[4937]: I0225 16:49:32.237067 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-p5kbh_0eb822b0-826a-4b2d-9376-141a69ba37e5/operator/0.log" Feb 25 16:49:37 crc kubenswrapper[4937]: I0225 16:49:37.001035 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:37 crc kubenswrapper[4937]: I0225 16:49:37.001317 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:37 crc kubenswrapper[4937]: I0225 16:49:37.060331 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:37 crc kubenswrapper[4937]: I0225 16:49:37.114951 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:37 crc kubenswrapper[4937]: I0225 16:49:37.304122 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-stnj6"] Feb 25 16:49:37 crc kubenswrapper[4937]: I0225 16:49:37.369006 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:49:37 crc kubenswrapper[4937]: E0225 16:49:37.369804 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:49:39 crc kubenswrapper[4937]: I0225 16:49:39.025963 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-stnj6" podUID="5766fa18-7a2c-44b1-97e2-eafce2097b00" containerName="registry-server" containerID="cri-o://ecf760f9100e981bbcc99ce4953a8e0b825ba896bdaa4d46601dc99223a69f03" gracePeriod=2 Feb 25 16:49:39 crc kubenswrapper[4937]: I0225 16:49:39.743610 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:39 crc kubenswrapper[4937]: I0225 16:49:39.795665 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvg97\" (UniqueName: \"kubernetes.io/projected/5766fa18-7a2c-44b1-97e2-eafce2097b00-kube-api-access-gvg97\") pod \"5766fa18-7a2c-44b1-97e2-eafce2097b00\" (UID: \"5766fa18-7a2c-44b1-97e2-eafce2097b00\") " Feb 25 16:49:39 crc kubenswrapper[4937]: I0225 16:49:39.795719 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5766fa18-7a2c-44b1-97e2-eafce2097b00-utilities\") pod \"5766fa18-7a2c-44b1-97e2-eafce2097b00\" (UID: \"5766fa18-7a2c-44b1-97e2-eafce2097b00\") " Feb 25 16:49:39 crc kubenswrapper[4937]: I0225 16:49:39.795764 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5766fa18-7a2c-44b1-97e2-eafce2097b00-catalog-content\") pod \"5766fa18-7a2c-44b1-97e2-eafce2097b00\" (UID: \"5766fa18-7a2c-44b1-97e2-eafce2097b00\") " Feb 25 16:49:39 crc kubenswrapper[4937]: I0225 16:49:39.797143 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5766fa18-7a2c-44b1-97e2-eafce2097b00-utilities" (OuterVolumeSpecName: "utilities") pod "5766fa18-7a2c-44b1-97e2-eafce2097b00" (UID: "5766fa18-7a2c-44b1-97e2-eafce2097b00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:49:39 crc kubenswrapper[4937]: I0225 16:49:39.808927 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5766fa18-7a2c-44b1-97e2-eafce2097b00-kube-api-access-gvg97" (OuterVolumeSpecName: "kube-api-access-gvg97") pod "5766fa18-7a2c-44b1-97e2-eafce2097b00" (UID: "5766fa18-7a2c-44b1-97e2-eafce2097b00"). InnerVolumeSpecName "kube-api-access-gvg97". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:49:39 crc kubenswrapper[4937]: I0225 16:49:39.818079 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5766fa18-7a2c-44b1-97e2-eafce2097b00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5766fa18-7a2c-44b1-97e2-eafce2097b00" (UID: "5766fa18-7a2c-44b1-97e2-eafce2097b00"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:49:39 crc kubenswrapper[4937]: I0225 16:49:39.900817 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvg97\" (UniqueName: \"kubernetes.io/projected/5766fa18-7a2c-44b1-97e2-eafce2097b00-kube-api-access-gvg97\") on node \"crc\" DevicePath \"\"" Feb 25 16:49:39 crc kubenswrapper[4937]: I0225 16:49:39.900882 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5766fa18-7a2c-44b1-97e2-eafce2097b00-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:49:39 crc kubenswrapper[4937]: I0225 16:49:39.900896 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5766fa18-7a2c-44b1-97e2-eafce2097b00-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.042835 4937 generic.go:334] "Generic (PLEG): container finished" podID="5766fa18-7a2c-44b1-97e2-eafce2097b00" containerID="ecf760f9100e981bbcc99ce4953a8e0b825ba896bdaa4d46601dc99223a69f03" exitCode=0 Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.042918 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-stnj6" Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.042907 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stnj6" event={"ID":"5766fa18-7a2c-44b1-97e2-eafce2097b00","Type":"ContainerDied","Data":"ecf760f9100e981bbcc99ce4953a8e0b825ba896bdaa4d46601dc99223a69f03"} Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.043963 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stnj6" event={"ID":"5766fa18-7a2c-44b1-97e2-eafce2097b00","Type":"ContainerDied","Data":"67b79297492c891f88807a38bb63772ba712256106f47b226f6ab83321c519a3"} Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.044027 4937 scope.go:117] "RemoveContainer" containerID="ecf760f9100e981bbcc99ce4953a8e0b825ba896bdaa4d46601dc99223a69f03" Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.085313 4937 scope.go:117] "RemoveContainer" containerID="5fe2104d9328e1920cc1f5eae69cd34ae38610fbe07e136fe9297eaad71f837a" Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.087127 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-stnj6"] Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.096983 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-stnj6"] Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.118181 4937 scope.go:117] "RemoveContainer" containerID="494f7d7e71bde9a155116100e01f3fd306828caac74a4b0bab70ee74e9bd225e" Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.147338 4937 scope.go:117] "RemoveContainer" containerID="ecf760f9100e981bbcc99ce4953a8e0b825ba896bdaa4d46601dc99223a69f03" Feb 25 16:49:40 crc kubenswrapper[4937]: E0225 16:49:40.147821 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecf760f9100e981bbcc99ce4953a8e0b825ba896bdaa4d46601dc99223a69f03\": container with ID starting with ecf760f9100e981bbcc99ce4953a8e0b825ba896bdaa4d46601dc99223a69f03 not found: ID does not exist" containerID="ecf760f9100e981bbcc99ce4953a8e0b825ba896bdaa4d46601dc99223a69f03" Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.147854 4937 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecf760f9100e981bbcc99ce4953a8e0b825ba896bdaa4d46601dc99223a69f03"} err="failed to get container status \"ecf760f9100e981bbcc99ce4953a8e0b825ba896bdaa4d46601dc99223a69f03\": rpc error: code = NotFound desc = could not find container \"ecf760f9100e981bbcc99ce4953a8e0b825ba896bdaa4d46601dc99223a69f03\": container with ID starting with ecf760f9100e981bbcc99ce4953a8e0b825ba896bdaa4d46601dc99223a69f03 not found: ID does not exist" Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.147879 4937 scope.go:117] "RemoveContainer" containerID="5fe2104d9328e1920cc1f5eae69cd34ae38610fbe07e136fe9297eaad71f837a" Feb 25 16:49:40 crc kubenswrapper[4937]: E0225 16:49:40.148204 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fe2104d9328e1920cc1f5eae69cd34ae38610fbe07e136fe9297eaad71f837a\": container with ID starting with 5fe2104d9328e1920cc1f5eae69cd34ae38610fbe07e136fe9297eaad71f837a not found: ID does not exist" containerID="5fe2104d9328e1920cc1f5eae69cd34ae38610fbe07e136fe9297eaad71f837a" Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.148329 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fe2104d9328e1920cc1f5eae69cd34ae38610fbe07e136fe9297eaad71f837a"} err="failed to get container status \"5fe2104d9328e1920cc1f5eae69cd34ae38610fbe07e136fe9297eaad71f837a\": rpc error: code = NotFound desc = could not find container \"5fe2104d9328e1920cc1f5eae69cd34ae38610fbe07e136fe9297eaad71f837a\": container with ID starting with 5fe2104d9328e1920cc1f5eae69cd34ae38610fbe07e136fe9297eaad71f837a not found: ID does not exist" Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.148447 4937 scope.go:117] "RemoveContainer" containerID="494f7d7e71bde9a155116100e01f3fd306828caac74a4b0bab70ee74e9bd225e" Feb 25 16:49:40 crc kubenswrapper[4937]: E0225 16:49:40.148938 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"494f7d7e71bde9a155116100e01f3fd306828caac74a4b0bab70ee74e9bd225e\": container with ID starting with 494f7d7e71bde9a155116100e01f3fd306828caac74a4b0bab70ee74e9bd225e not found: ID does not exist" containerID="494f7d7e71bde9a155116100e01f3fd306828caac74a4b0bab70ee74e9bd225e" Feb 25 16:49:40 crc kubenswrapper[4937]: I0225 16:49:40.149010 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494f7d7e71bde9a155116100e01f3fd306828caac74a4b0bab70ee74e9bd225e"} err="failed to get container status \"494f7d7e71bde9a155116100e01f3fd306828caac74a4b0bab70ee74e9bd225e\": rpc error: code = NotFound desc = could not find container \"494f7d7e71bde9a155116100e01f3fd306828caac74a4b0bab70ee74e9bd225e\": container with ID starting with 494f7d7e71bde9a155116100e01f3fd306828caac74a4b0bab70ee74e9bd225e not found: ID does not exist" Feb 25 16:49:41 crc kubenswrapper[4937]: I0225 16:49:41.378355 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5766fa18-7a2c-44b1-97e2-eafce2097b00" path="/var/lib/kubelet/pods/5766fa18-7a2c-44b1-97e2-eafce2097b00/volumes" Feb 25 16:49:46 crc kubenswrapper[4937]: I0225 16:49:46.677437 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-b5f46f5f7-zscl6_b72cc98b-e045-4ade-bdf7-c9929fc489fc/kube-rbac-proxy/0.log" Feb 25 16:49:46 crc kubenswrapper[4937]: 
I0225 16:49:46.737176 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-b5f46f5f7-zscl6_b72cc98b-e045-4ade-bdf7-c9929fc489fc/manager/0.log" Feb 25 16:49:51 crc kubenswrapper[4937]: I0225 16:49:51.373762 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:49:51 crc kubenswrapper[4937]: E0225 16:49:51.374413 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.173157 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533970-v9vrz"] Feb 25 16:50:00 crc kubenswrapper[4937]: E0225 16:50:00.174183 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5766fa18-7a2c-44b1-97e2-eafce2097b00" containerName="registry-server" Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.174200 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="5766fa18-7a2c-44b1-97e2-eafce2097b00" containerName="registry-server" Feb 25 16:50:00 crc kubenswrapper[4937]: E0225 16:50:00.174235 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5766fa18-7a2c-44b1-97e2-eafce2097b00" containerName="extract-utilities" Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.174241 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="5766fa18-7a2c-44b1-97e2-eafce2097b00" containerName="extract-utilities" Feb 25 16:50:00 crc kubenswrapper[4937]: E0225 16:50:00.174253 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5766fa18-7a2c-44b1-97e2-eafce2097b00" containerName="extract-content" Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.174260 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="5766fa18-7a2c-44b1-97e2-eafce2097b00" containerName="extract-content" Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.174661 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="5766fa18-7a2c-44b1-97e2-eafce2097b00" containerName="registry-server" Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.175371 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533970-v9vrz" Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.177768 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.184884 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533970-v9vrz"] Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.187420 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.188361 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.233647 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmwz8\" (UniqueName: \"kubernetes.io/projected/e02b8833-357e-4e00-b36c-037e417e3acf-kube-api-access-nmwz8\") pod \"auto-csr-approver-29533970-v9vrz\" (UID: \"e02b8833-357e-4e00-b36c-037e417e3acf\") " pod="openshift-infra/auto-csr-approver-29533970-v9vrz" Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.336043 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmwz8\" (UniqueName: \"kubernetes.io/projected/e02b8833-357e-4e00-b36c-037e417e3acf-kube-api-access-nmwz8\") pod \"auto-csr-approver-29533970-v9vrz\" (UID: \"e02b8833-357e-4e00-b36c-037e417e3acf\") " pod="openshift-infra/auto-csr-approver-29533970-v9vrz" Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.372981 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmwz8\" (UniqueName: \"kubernetes.io/projected/e02b8833-357e-4e00-b36c-037e417e3acf-kube-api-access-nmwz8\") pod \"auto-csr-approver-29533970-v9vrz\" (UID: \"e02b8833-357e-4e00-b36c-037e417e3acf\") " pod="openshift-infra/auto-csr-approver-29533970-v9vrz" Feb 25 16:50:00 crc kubenswrapper[4937]: I0225 16:50:00.494136 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533970-v9vrz" Feb 25 16:50:01 crc kubenswrapper[4937]: I0225 16:50:01.445808 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533970-v9vrz"] Feb 25 16:50:02 crc kubenswrapper[4937]: I0225 16:50:02.304676 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533970-v9vrz" event={"ID":"e02b8833-357e-4e00-b36c-037e417e3acf","Type":"ContainerStarted","Data":"cfdb367f79152aa567a73b6dec3a2ac9da60c03b1c25381d867dffed1b49c234"} Feb 25 16:50:04 crc kubenswrapper[4937]: I0225 16:50:04.323560 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533970-v9vrz" event={"ID":"e02b8833-357e-4e00-b36c-037e417e3acf","Type":"ContainerStarted","Data":"29cb8ae0c5f659341c1483f0355d886968e5310df8559408592bec347aeaa2fb"} Feb 25 16:50:04 crc kubenswrapper[4937]: I0225 16:50:04.342181 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533970-v9vrz" podStartSLOduration=2.079947362 podStartE2EDuration="4.342161653s" podCreationTimestamp="2026-02-25 16:50:00 +0000 UTC" firstStartedPulling="2026-02-25 16:50:01.457414586 +0000 UTC m=+3852.470806476" lastFinishedPulling="2026-02-25 16:50:03.719628877 +0000 UTC m=+3854.733020767" observedRunningTime="2026-02-25 16:50:04.339981808 +0000 UTC m=+3855.353373698" watchObservedRunningTime="2026-02-25 16:50:04.342161653 +0000 UTC m=+3855.355553543" Feb 25 16:50:04 crc kubenswrapper[4937]: I0225 16:50:04.367437 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:50:04 crc kubenswrapper[4937]: E0225 16:50:04.367715 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:50:05 crc kubenswrapper[4937]: I0225 16:50:05.348544 4937 generic.go:334] "Generic (PLEG): container finished" podID="e02b8833-357e-4e00-b36c-037e417e3acf" containerID="29cb8ae0c5f659341c1483f0355d886968e5310df8559408592bec347aeaa2fb" exitCode=0 Feb 25 16:50:05 crc kubenswrapper[4937]: I0225 16:50:05.348738 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533970-v9vrz" event={"ID":"e02b8833-357e-4e00-b36c-037e417e3acf","Type":"ContainerDied","Data":"29cb8ae0c5f659341c1483f0355d886968e5310df8559408592bec347aeaa2fb"} Feb 25 16:50:07 crc kubenswrapper[4937]: I0225 16:50:07.092704 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533970-v9vrz" Feb 25 16:50:07 crc kubenswrapper[4937]: I0225 16:50:07.213007 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmwz8\" (UniqueName: \"kubernetes.io/projected/e02b8833-357e-4e00-b36c-037e417e3acf-kube-api-access-nmwz8\") pod \"e02b8833-357e-4e00-b36c-037e417e3acf\" (UID: \"e02b8833-357e-4e00-b36c-037e417e3acf\") " Feb 25 16:50:07 crc kubenswrapper[4937]: I0225 16:50:07.219584 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e02b8833-357e-4e00-b36c-037e417e3acf-kube-api-access-nmwz8" (OuterVolumeSpecName: "kube-api-access-nmwz8") pod "e02b8833-357e-4e00-b36c-037e417e3acf" (UID: "e02b8833-357e-4e00-b36c-037e417e3acf"). InnerVolumeSpecName "kube-api-access-nmwz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:50:07 crc kubenswrapper[4937]: I0225 16:50:07.315796 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmwz8\" (UniqueName: \"kubernetes.io/projected/e02b8833-357e-4e00-b36c-037e417e3acf-kube-api-access-nmwz8\") on node \"crc\" DevicePath \"\"" Feb 25 16:50:07 crc kubenswrapper[4937]: I0225 16:50:07.373390 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533970-v9vrz" Feb 25 16:50:07 crc kubenswrapper[4937]: I0225 16:50:07.380180 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533970-v9vrz" event={"ID":"e02b8833-357e-4e00-b36c-037e417e3acf","Type":"ContainerDied","Data":"cfdb367f79152aa567a73b6dec3a2ac9da60c03b1c25381d867dffed1b49c234"} Feb 25 16:50:07 crc kubenswrapper[4937]: I0225 16:50:07.380231 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfdb367f79152aa567a73b6dec3a2ac9da60c03b1c25381d867dffed1b49c234" Feb 25 16:50:07 crc kubenswrapper[4937]: I0225 16:50:07.433307 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533964-rrcl8"] Feb 25 16:50:07 crc kubenswrapper[4937]: I0225 16:50:07.447419 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533964-rrcl8"] Feb 25 16:50:09 crc kubenswrapper[4937]: I0225 16:50:09.378743 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="261e5bc8-bfe5-4036-8599-19cd5c173b63" path="/var/lib/kubelet/pods/261e5bc8-bfe5-4036-8599-19cd5c173b63/volumes" Feb 25 16:50:15 crc kubenswrapper[4937]: I0225 16:50:15.368696 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:50:15 crc kubenswrapper[4937]: E0225 16:50:15.369768 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:50:26 crc kubenswrapper[4937]: I0225 16:50:26.369216 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:50:26 crc kubenswrapper[4937]: E0225 16:50:26.369861 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:50:40 crc kubenswrapper[4937]: I0225 16:50:40.367973 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:50:40 crc kubenswrapper[4937]: E0225 16:50:40.368618 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:50:43 crc kubenswrapper[4937]: I0225 16:50:43.788051 4937 scope.go:117] "RemoveContainer" containerID="5355803bb7fbcf681f764ce9972c98fb75aeef53730dbeb246a47805a76ab3b3" Feb 25 16:50:52 crc kubenswrapper[4937]: I0225 16:50:52.369905 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:50:52 crc kubenswrapper[4937]: E0225 16:50:52.371517 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:51:05 crc kubenswrapper[4937]: I0225 16:51:05.368717 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:51:05 crc kubenswrapper[4937]: E0225 16:51:05.369472 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:51:18 crc kubenswrapper[4937]: I0225 16:51:18.368218 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:51:18 crc kubenswrapper[4937]: E0225 16:51:18.368926 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:51:29 crc kubenswrapper[4937]: I0225 16:51:29.367274 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:51:29 crc kubenswrapper[4937]: E0225 16:51:29.368032 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:51:41 crc kubenswrapper[4937]: I0225 16:51:41.348825 4937 generic.go:334] "Generic (PLEG): container finished" podID="4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e" containerID="109a3de283c41cbb01b1aaadfaecfaa86d3ad9c4f7e00f1660b74dedaa323e5c" exitCode=0 Feb 25 16:51:41 crc kubenswrapper[4937]: I0225 16:51:41.348930 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jvrhp/must-gather-n2crw" event={"ID":"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e","Type":"ContainerDied","Data":"109a3de283c41cbb01b1aaadfaecfaa86d3ad9c4f7e00f1660b74dedaa323e5c"} Feb 25 16:51:41 crc kubenswrapper[4937]: I0225 16:51:41.350286 4937 scope.go:117] "RemoveContainer" containerID="109a3de283c41cbb01b1aaadfaecfaa86d3ad9c4f7e00f1660b74dedaa323e5c" Feb 25 16:51:41 crc kubenswrapper[4937]: I0225 16:51:41.587825 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jvrhp_must-gather-n2crw_4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e/gather/0.log" Feb 25 16:51:44 crc kubenswrapper[4937]: I0225 16:51:44.368199 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:51:44 crc kubenswrapper[4937]: E0225 16:51:44.369092 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:51:50 crc kubenswrapper[4937]: I0225 16:51:50.198818 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jvrhp/must-gather-n2crw"] Feb 25 16:51:50 crc kubenswrapper[4937]: I0225 16:51:50.199696 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jvrhp/must-gather-n2crw" podUID="4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e" containerName="copy" containerID="cri-o://f59105570b7f271eb4f73dba9a935431309fb61003e7767267e2f7e144dd5b1f" gracePeriod=2 Feb 25 16:51:50 crc kubenswrapper[4937]: I0225 16:51:50.213760 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jvrhp/must-gather-n2crw"] Feb 25 16:51:50 crc kubenswrapper[4937]: I0225 16:51:50.459266 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jvrhp_must-gather-n2crw_4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e/copy/0.log" Feb 25 16:51:50 crc kubenswrapper[4937]: I0225 16:51:50.465125 4937 generic.go:334] "Generic (PLEG): container finished" podID="4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e" containerID="f59105570b7f271eb4f73dba9a935431309fb61003e7767267e2f7e144dd5b1f" exitCode=143 Feb 25 16:51:50 crc kubenswrapper[4937]: I0225 16:51:50.948148 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jvrhp_must-gather-n2crw_4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e/copy/0.log" Feb 25 16:51:50 crc kubenswrapper[4937]: I0225 16:51:50.949047 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jvrhp/must-gather-n2crw" Feb 25 16:51:51 crc kubenswrapper[4937]: I0225 16:51:51.106014 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnfq9\" (UniqueName: \"kubernetes.io/projected/4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e-kube-api-access-tnfq9\") pod \"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e\" (UID: \"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e\") " Feb 25 16:51:51 crc kubenswrapper[4937]: I0225 16:51:51.106166 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e-must-gather-output\") pod \"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e\" (UID: \"4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e\") " Feb 25 16:51:51 crc kubenswrapper[4937]: I0225 16:51:51.112383 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e-kube-api-access-tnfq9" (OuterVolumeSpecName: "kube-api-access-tnfq9") pod "4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e" (UID: "4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e"). InnerVolumeSpecName "kube-api-access-tnfq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:51:51 crc kubenswrapper[4937]: I0225 16:51:51.209636 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnfq9\" (UniqueName: \"kubernetes.io/projected/4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e-kube-api-access-tnfq9\") on node \"crc\" DevicePath \"\"" Feb 25 16:51:51 crc kubenswrapper[4937]: I0225 16:51:51.318473 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e" (UID: "4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:51:51 crc kubenswrapper[4937]: I0225 16:51:51.386763 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e" path="/var/lib/kubelet/pods/4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e/volumes" Feb 25 16:51:51 crc kubenswrapper[4937]: I0225 16:51:51.413577 4937 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 25 16:51:51 crc kubenswrapper[4937]: I0225 16:51:51.499758 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jvrhp_must-gather-n2crw_4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e/copy/0.log" Feb 25 16:51:51 crc kubenswrapper[4937]: I0225 16:51:51.505743 4937 scope.go:117] "RemoveContainer" containerID="f59105570b7f271eb4f73dba9a935431309fb61003e7767267e2f7e144dd5b1f" Feb 25 16:51:51 crc kubenswrapper[4937]: I0225 16:51:51.505917 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jvrhp/must-gather-n2crw" Feb 25 16:51:51 crc kubenswrapper[4937]: I0225 16:51:51.540205 4937 scope.go:117] "RemoveContainer" containerID="109a3de283c41cbb01b1aaadfaecfaa86d3ad9c4f7e00f1660b74dedaa323e5c" Feb 25 16:51:57 crc kubenswrapper[4937]: I0225 16:51:57.368359 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:51:57 crc kubenswrapper[4937]: E0225 16:51:57.369740 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.165139 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533972-8pwxc"] Feb 25 16:52:00 crc kubenswrapper[4937]: E0225 16:52:00.166953 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e" containerName="gather" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.167071 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e" containerName="gather" Feb 25 16:52:00 crc kubenswrapper[4937]: E0225 16:52:00.167171 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e" containerName="copy" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.167252 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e" containerName="copy" Feb 25 16:52:00 crc kubenswrapper[4937]: E0225 16:52:00.167340 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02b8833-357e-4e00-b36c-037e417e3acf" containerName="oc" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.167411 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02b8833-357e-4e00-b36c-037e417e3acf" containerName="oc" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.167747 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e" containerName="gather" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.167845 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b09d6a5-4cca-47fe-b1ae-15b734a6fe6e" containerName="copy" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.167950 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e02b8833-357e-4e00-b36c-037e417e3acf" containerName="oc" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.168965 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533972-8pwxc" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.173934 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533972-8pwxc"] Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.178192 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.178514 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.179224 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.300551 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crb2n\" (UniqueName: \"kubernetes.io/projected/bd05df20-2126-4248-bdb5-2e574b56e291-kube-api-access-crb2n\") pod \"auto-csr-approver-29533972-8pwxc\" (UID: \"bd05df20-2126-4248-bdb5-2e574b56e291\") " pod="openshift-infra/auto-csr-approver-29533972-8pwxc" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.402418 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crb2n\" (UniqueName: \"kubernetes.io/projected/bd05df20-2126-4248-bdb5-2e574b56e291-kube-api-access-crb2n\") pod \"auto-csr-approver-29533972-8pwxc\" (UID: \"bd05df20-2126-4248-bdb5-2e574b56e291\") " pod="openshift-infra/auto-csr-approver-29533972-8pwxc" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.419836 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crb2n\" (UniqueName: \"kubernetes.io/projected/bd05df20-2126-4248-bdb5-2e574b56e291-kube-api-access-crb2n\") pod \"auto-csr-approver-29533972-8pwxc\" (UID: \"bd05df20-2126-4248-bdb5-2e574b56e291\") " pod="openshift-infra/auto-csr-approver-29533972-8pwxc" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.493535 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533972-8pwxc" Feb 25 16:52:00 crc kubenswrapper[4937]: I0225 16:52:00.964422 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533972-8pwxc"] Feb 25 16:52:01 crc kubenswrapper[4937]: I0225 16:52:01.627293 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533972-8pwxc" event={"ID":"bd05df20-2126-4248-bdb5-2e574b56e291","Type":"ContainerStarted","Data":"2babecd2761697898581396df51dff96903692c7eb03fb277d4334d22057f5af"} Feb 25 16:52:03 crc kubenswrapper[4937]: I0225 16:52:03.645570 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533972-8pwxc" event={"ID":"bd05df20-2126-4248-bdb5-2e574b56e291","Type":"ContainerStarted","Data":"bb81505b111baf8c042ae7d60bcf01d4846b804b1ed8eef3682bd074e5f015ba"} Feb 25 16:52:03 crc kubenswrapper[4937]: I0225 16:52:03.661766 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533972-8pwxc" podStartSLOduration=1.479528022 podStartE2EDuration="3.661749692s" podCreationTimestamp="2026-02-25 16:52:00 +0000 UTC" firstStartedPulling="2026-02-25 16:52:00.968580728 +0000 UTC m=+3971.981972618" lastFinishedPulling="2026-02-25 16:52:03.150802398 +0000 UTC m=+3974.164194288" observedRunningTime="2026-02-25 16:52:03.657520677 +0000 UTC m=+3974.670912567" watchObservedRunningTime="2026-02-25 16:52:03.661749692 +0000 UTC m=+3974.675141582" Feb 25 16:52:04 crc kubenswrapper[4937]: I0225 16:52:04.658257 4937 generic.go:334] "Generic (PLEG): container finished" podID="bd05df20-2126-4248-bdb5-2e574b56e291" containerID="bb81505b111baf8c042ae7d60bcf01d4846b804b1ed8eef3682bd074e5f015ba" exitCode=0 Feb 25 16:52:04 crc kubenswrapper[4937]: I0225 16:52:04.658298 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533972-8pwxc" event={"ID":"bd05df20-2126-4248-bdb5-2e574b56e291","Type":"ContainerDied","Data":"bb81505b111baf8c042ae7d60bcf01d4846b804b1ed8eef3682bd074e5f015ba"} Feb 25 16:52:06 crc kubenswrapper[4937]: I0225 16:52:06.310011 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533972-8pwxc" Feb 25 16:52:06 crc kubenswrapper[4937]: I0225 16:52:06.426571 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crb2n\" (UniqueName: \"kubernetes.io/projected/bd05df20-2126-4248-bdb5-2e574b56e291-kube-api-access-crb2n\") pod \"bd05df20-2126-4248-bdb5-2e574b56e291\" (UID: \"bd05df20-2126-4248-bdb5-2e574b56e291\") " Feb 25 16:52:06 crc kubenswrapper[4937]: I0225 16:52:06.432934 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd05df20-2126-4248-bdb5-2e574b56e291-kube-api-access-crb2n" (OuterVolumeSpecName: "kube-api-access-crb2n") pod "bd05df20-2126-4248-bdb5-2e574b56e291" (UID: "bd05df20-2126-4248-bdb5-2e574b56e291"). InnerVolumeSpecName "kube-api-access-crb2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:52:06 crc kubenswrapper[4937]: I0225 16:52:06.529438 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crb2n\" (UniqueName: \"kubernetes.io/projected/bd05df20-2126-4248-bdb5-2e574b56e291-kube-api-access-crb2n\") on node \"crc\" DevicePath \"\"" Feb 25 16:52:06 crc kubenswrapper[4937]: I0225 16:52:06.682385 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533972-8pwxc" event={"ID":"bd05df20-2126-4248-bdb5-2e574b56e291","Type":"ContainerDied","Data":"2babecd2761697898581396df51dff96903692c7eb03fb277d4334d22057f5af"} Feb 25 16:52:06 crc kubenswrapper[4937]: I0225 16:52:06.682867 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2babecd2761697898581396df51dff96903692c7eb03fb277d4334d22057f5af" Feb 25 16:52:06 crc kubenswrapper[4937]: I0225 16:52:06.682441 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533972-8pwxc" Feb 25 16:52:06 crc kubenswrapper[4937]: I0225 16:52:06.723698 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533966-blbs6"] Feb 25 16:52:06 crc kubenswrapper[4937]: I0225 16:52:06.736041 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533966-blbs6"] Feb 25 16:52:07 crc kubenswrapper[4937]: I0225 16:52:07.378233 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b35a656-aaf5-4dfb-97fb-4b80d530e729" path="/var/lib/kubelet/pods/9b35a656-aaf5-4dfb-97fb-4b80d530e729/volumes" Feb 25 16:52:12 crc kubenswrapper[4937]: I0225 16:52:12.367263 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:52:12 crc kubenswrapper[4937]: E0225 16:52:12.368050 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:52:24 crc kubenswrapper[4937]: I0225 16:52:24.368143 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:52:24 crc kubenswrapper[4937]: E0225 16:52:24.368951 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:52:36 crc kubenswrapper[4937]: I0225 16:52:36.367849 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:52:36 crc kubenswrapper[4937]: E0225 16:52:36.368886 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:52:43 crc kubenswrapper[4937]: I0225 16:52:43.899849 4937 scope.go:117] "RemoveContainer" containerID="c664db3cb6141b0ebe690919484b0051471ebc0c271ecc246f9b2a5996858002" Feb 25 16:52:51 crc kubenswrapper[4937]: I0225 16:52:51.380728 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:52:51 crc kubenswrapper[4937]: E0225 16:52:51.381927 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:53:03 crc kubenswrapper[4937]: I0225 16:53:03.368894 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:53:03 crc kubenswrapper[4937]: E0225 16:53:03.369780 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:53:14 crc kubenswrapper[4937]: I0225 16:53:14.369597 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:53:14 crc kubenswrapper[4937]: E0225 16:53:14.371360 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.271060 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jt9d7"] Feb 25 16:53:22 crc kubenswrapper[4937]: E0225 16:53:22.272824 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd05df20-2126-4248-bdb5-2e574b56e291" containerName="oc" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.272849 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd05df20-2126-4248-bdb5-2e574b56e291" containerName="oc" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.273202 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd05df20-2126-4248-bdb5-2e574b56e291" containerName="oc" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.276873 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.288671 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jt9d7"] Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.364065 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zhbr\" (UniqueName: \"kubernetes.io/projected/f86daca3-c59c-4899-94f1-f00ac31a01c8-kube-api-access-9zhbr\") pod \"redhat-operators-jt9d7\" (UID: \"f86daca3-c59c-4899-94f1-f00ac31a01c8\") " pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.364151 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f86daca3-c59c-4899-94f1-f00ac31a01c8-catalog-content\") pod \"redhat-operators-jt9d7\" (UID: \"f86daca3-c59c-4899-94f1-f00ac31a01c8\") " pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.364272 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f86daca3-c59c-4899-94f1-f00ac31a01c8-utilities\") pod \"redhat-operators-jt9d7\" (UID: \"f86daca3-c59c-4899-94f1-f00ac31a01c8\") " pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.466749 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zhbr\" (UniqueName: \"kubernetes.io/projected/f86daca3-c59c-4899-94f1-f00ac31a01c8-kube-api-access-9zhbr\") pod \"redhat-operators-jt9d7\" (UID: \"f86daca3-c59c-4899-94f1-f00ac31a01c8\") " pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.466881 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f86daca3-c59c-4899-94f1-f00ac31a01c8-catalog-content\") pod \"redhat-operators-jt9d7\" (UID: \"f86daca3-c59c-4899-94f1-f00ac31a01c8\") " pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.467068 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f86daca3-c59c-4899-94f1-f00ac31a01c8-utilities\") pod \"redhat-operators-jt9d7\" (UID: \"f86daca3-c59c-4899-94f1-f00ac31a01c8\") " pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.467863 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f86daca3-c59c-4899-94f1-f00ac31a01c8-utilities\") pod \"redhat-operators-jt9d7\" (UID: \"f86daca3-c59c-4899-94f1-f00ac31a01c8\") " pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.467874 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f86daca3-c59c-4899-94f1-f00ac31a01c8-catalog-content\") pod \"redhat-operators-jt9d7\" (UID: \"f86daca3-c59c-4899-94f1-f00ac31a01c8\") " pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.494095 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9zhbr\" (UniqueName: \"kubernetes.io/projected/f86daca3-c59c-4899-94f1-f00ac31a01c8-kube-api-access-9zhbr\") pod \"redhat-operators-jt9d7\" (UID: \"f86daca3-c59c-4899-94f1-f00ac31a01c8\") " pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:22 crc kubenswrapper[4937]: I0225 16:53:22.627246 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:23 crc kubenswrapper[4937]: I0225 16:53:23.129003 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jt9d7"] Feb 25 16:53:23 crc kubenswrapper[4937]: I0225 16:53:23.469035 4937 generic.go:334] "Generic (PLEG): container finished" podID="f86daca3-c59c-4899-94f1-f00ac31a01c8" containerID="ba72ac89507183838d9221d120afe54bbcb6b0b98ef41e65e9bfe62a5fd950f6" exitCode=0 Feb 25 16:53:23 crc kubenswrapper[4937]: I0225 16:53:23.469104 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jt9d7" event={"ID":"f86daca3-c59c-4899-94f1-f00ac31a01c8","Type":"ContainerDied","Data":"ba72ac89507183838d9221d120afe54bbcb6b0b98ef41e65e9bfe62a5fd950f6"} Feb 25 16:53:23 crc kubenswrapper[4937]: I0225 16:53:23.469324 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jt9d7" event={"ID":"f86daca3-c59c-4899-94f1-f00ac31a01c8","Type":"ContainerStarted","Data":"df310d931ee680e409955e5bcd65c32f98aaf04789724091c8429d8c4972fb0a"} Feb 25 16:53:24 crc kubenswrapper[4937]: I0225 16:53:24.481220 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jt9d7" event={"ID":"f86daca3-c59c-4899-94f1-f00ac31a01c8","Type":"ContainerStarted","Data":"2dda98c99d0944449433a7437f4665571f234ddaf36ca35494f6f0571b155f25"} Feb 25 16:53:28 crc kubenswrapper[4937]: I0225 16:53:28.368421 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:53:28 crc kubenswrapper[4937]: E0225 16:53:28.369748 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:53:29 crc kubenswrapper[4937]: I0225 16:53:29.535036 4937 generic.go:334] "Generic (PLEG): container finished" podID="f86daca3-c59c-4899-94f1-f00ac31a01c8" containerID="2dda98c99d0944449433a7437f4665571f234ddaf36ca35494f6f0571b155f25" exitCode=0 Feb 25 16:53:29 crc kubenswrapper[4937]: I0225 16:53:29.535143 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jt9d7" event={"ID":"f86daca3-c59c-4899-94f1-f00ac31a01c8","Type":"ContainerDied","Data":"2dda98c99d0944449433a7437f4665571f234ddaf36ca35494f6f0571b155f25"} Feb 25 16:53:30 crc kubenswrapper[4937]: I0225 16:53:30.552175 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jt9d7" event={"ID":"f86daca3-c59c-4899-94f1-f00ac31a01c8","Type":"ContainerStarted","Data":"408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af"} Feb 25 16:53:30 crc kubenswrapper[4937]: I0225 16:53:30.586326 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-jt9d7" podStartSLOduration=2.096758686 podStartE2EDuration="8.586303837s" podCreationTimestamp="2026-02-25 16:53:22 +0000 UTC" firstStartedPulling="2026-02-25 16:53:23.470963909 +0000 UTC m=+4054.484355789" lastFinishedPulling="2026-02-25 16:53:29.96050905 +0000 UTC m=+4060.973900940" observedRunningTime="2026-02-25 16:53:30.578926723 +0000 UTC m=+4061.592318633" watchObservedRunningTime="2026-02-25 16:53:30.586303837 +0000 UTC m=+4061.599695737" Feb 25 16:53:32 crc kubenswrapper[4937]: I0225 16:53:32.628215 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:32 crc kubenswrapper[4937]: I0225 16:53:32.629332 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:33 crc kubenswrapper[4937]: I0225 16:53:33.678512 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jt9d7" podUID="f86daca3-c59c-4899-94f1-f00ac31a01c8" containerName="registry-server" probeResult="failure" output=< Feb 25 16:53:33 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 16:53:33 crc kubenswrapper[4937]: > Feb 25 16:53:39 crc kubenswrapper[4937]: I0225 16:53:39.367812 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:53:39 crc kubenswrapper[4937]: E0225 16:53:39.368637 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 16:53:42 crc kubenswrapper[4937]: I0225 16:53:42.688933 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:42 crc kubenswrapper[4937]: I0225 16:53:42.746105 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:42 crc kubenswrapper[4937]: I0225 16:53:42.928005 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jt9d7"] Feb 25 16:53:44 crc kubenswrapper[4937]: I0225 16:53:44.713659 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jt9d7" podUID="f86daca3-c59c-4899-94f1-f00ac31a01c8" containerName="registry-server" containerID="cri-o://408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af" gracePeriod=2 Feb 25 16:53:44 crc kubenswrapper[4937]: E0225 16:53:44.907220 4937 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf86daca3_c59c_4899_94f1_f00ac31a01c8.slice/crio-408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af.scope\": RecentStats: unable to find data in memory cache]" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.444955 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.564239 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f86daca3-c59c-4899-94f1-f00ac31a01c8-catalog-content\") pod \"f86daca3-c59c-4899-94f1-f00ac31a01c8\" (UID: \"f86daca3-c59c-4899-94f1-f00ac31a01c8\") " Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.564421 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zhbr\" (UniqueName: \"kubernetes.io/projected/f86daca3-c59c-4899-94f1-f00ac31a01c8-kube-api-access-9zhbr\") pod \"f86daca3-c59c-4899-94f1-f00ac31a01c8\" (UID: \"f86daca3-c59c-4899-94f1-f00ac31a01c8\") " Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.564508 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f86daca3-c59c-4899-94f1-f00ac31a01c8-utilities\") pod \"f86daca3-c59c-4899-94f1-f00ac31a01c8\" (UID: \"f86daca3-c59c-4899-94f1-f00ac31a01c8\") " Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.565361 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f86daca3-c59c-4899-94f1-f00ac31a01c8-utilities" (OuterVolumeSpecName: "utilities") pod "f86daca3-c59c-4899-94f1-f00ac31a01c8" (UID: "f86daca3-c59c-4899-94f1-f00ac31a01c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.572219 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f86daca3-c59c-4899-94f1-f00ac31a01c8-kube-api-access-9zhbr" (OuterVolumeSpecName: "kube-api-access-9zhbr") pod "f86daca3-c59c-4899-94f1-f00ac31a01c8" (UID: "f86daca3-c59c-4899-94f1-f00ac31a01c8"). InnerVolumeSpecName "kube-api-access-9zhbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.667078 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zhbr\" (UniqueName: \"kubernetes.io/projected/f86daca3-c59c-4899-94f1-f00ac31a01c8-kube-api-access-9zhbr\") on node \"crc\" DevicePath \"\"" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.667101 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f86daca3-c59c-4899-94f1-f00ac31a01c8-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.730411 4937 generic.go:334] "Generic (PLEG): container finished" podID="f86daca3-c59c-4899-94f1-f00ac31a01c8" containerID="408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af" exitCode=0 Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.730460 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jt9d7" event={"ID":"f86daca3-c59c-4899-94f1-f00ac31a01c8","Type":"ContainerDied","Data":"408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af"} Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.730473 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jt9d7" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.730511 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jt9d7" event={"ID":"f86daca3-c59c-4899-94f1-f00ac31a01c8","Type":"ContainerDied","Data":"df310d931ee680e409955e5bcd65c32f98aaf04789724091c8429d8c4972fb0a"} Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.730538 4937 scope.go:117] "RemoveContainer" containerID="408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.730878 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f86daca3-c59c-4899-94f1-f00ac31a01c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f86daca3-c59c-4899-94f1-f00ac31a01c8" (UID: "f86daca3-c59c-4899-94f1-f00ac31a01c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.759773 4937 scope.go:117] "RemoveContainer" containerID="2dda98c99d0944449433a7437f4665571f234ddaf36ca35494f6f0571b155f25" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.787975 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f86daca3-c59c-4899-94f1-f00ac31a01c8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.788012 4937 scope.go:117] "RemoveContainer" containerID="ba72ac89507183838d9221d120afe54bbcb6b0b98ef41e65e9bfe62a5fd950f6" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.848829 4937 scope.go:117] "RemoveContainer" containerID="408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af" Feb 25 16:53:45 crc kubenswrapper[4937]: E0225 16:53:45.855545 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af\": container with ID starting with 408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af not found: ID does not exist" containerID="408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.855610 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af"} err="failed to get container status \"408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af\": rpc error: code = NotFound desc = could not find container \"408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af\": container with ID starting with 408113f32e7d446ebb83f7cab2b025289491f5204fa091ccf6a9a7fcee2832af not found: ID does not exist" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.855645 4937 scope.go:117] "RemoveContainer" containerID="2dda98c99d0944449433a7437f4665571f234ddaf36ca35494f6f0571b155f25" Feb 25 16:53:45 crc kubenswrapper[4937]: E0225 16:53:45.856043 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dda98c99d0944449433a7437f4665571f234ddaf36ca35494f6f0571b155f25\": container with ID starting with 2dda98c99d0944449433a7437f4665571f234ddaf36ca35494f6f0571b155f25 not found: ID does not exist" containerID="2dda98c99d0944449433a7437f4665571f234ddaf36ca35494f6f0571b155f25" Feb 25 16:53:45 crc 
kubenswrapper[4937]: I0225 16:53:45.856076 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dda98c99d0944449433a7437f4665571f234ddaf36ca35494f6f0571b155f25"} err="failed to get container status \"2dda98c99d0944449433a7437f4665571f234ddaf36ca35494f6f0571b155f25\": rpc error: code = NotFound desc = could not find container \"2dda98c99d0944449433a7437f4665571f234ddaf36ca35494f6f0571b155f25\": container with ID starting with 2dda98c99d0944449433a7437f4665571f234ddaf36ca35494f6f0571b155f25 not found: ID does not exist" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.856097 4937 scope.go:117] "RemoveContainer" containerID="ba72ac89507183838d9221d120afe54bbcb6b0b98ef41e65e9bfe62a5fd950f6" Feb 25 16:53:45 crc kubenswrapper[4937]: E0225 16:53:45.856868 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba72ac89507183838d9221d120afe54bbcb6b0b98ef41e65e9bfe62a5fd950f6\": container with ID starting with ba72ac89507183838d9221d120afe54bbcb6b0b98ef41e65e9bfe62a5fd950f6 not found: ID does not exist" containerID="ba72ac89507183838d9221d120afe54bbcb6b0b98ef41e65e9bfe62a5fd950f6" Feb 25 16:53:45 crc kubenswrapper[4937]: I0225 16:53:45.856901 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba72ac89507183838d9221d120afe54bbcb6b0b98ef41e65e9bfe62a5fd950f6"} err="failed to get container status \"ba72ac89507183838d9221d120afe54bbcb6b0b98ef41e65e9bfe62a5fd950f6\": rpc error: code = NotFound desc = could not find container \"ba72ac89507183838d9221d120afe54bbcb6b0b98ef41e65e9bfe62a5fd950f6\": container with ID starting with ba72ac89507183838d9221d120afe54bbcb6b0b98ef41e65e9bfe62a5fd950f6 not found: ID does not exist" Feb 25 16:53:46 crc kubenswrapper[4937]: I0225 16:53:46.063516 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jt9d7"] Feb 25 16:53:46 crc kubenswrapper[4937]: I0225 16:53:46.071771 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jt9d7"] Feb 25 16:53:47 crc kubenswrapper[4937]: I0225 16:53:47.383883 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f86daca3-c59c-4899-94f1-f00ac31a01c8" path="/var/lib/kubelet/pods/f86daca3-c59c-4899-94f1-f00ac31a01c8/volumes" Feb 25 16:53:52 crc kubenswrapper[4937]: I0225 16:53:52.368027 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:53:52 crc kubenswrapper[4937]: I0225 16:53:52.812893 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"0ce94622471d9659329394fa1b2af5fd3461490cd063464e738059704ab88ac2"} Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.190057 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533974-p9qq8"] Feb 25 16:54:00 crc kubenswrapper[4937]: E0225 16:54:00.190891 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86daca3-c59c-4899-94f1-f00ac31a01c8" containerName="registry-server" Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.190904 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86daca3-c59c-4899-94f1-f00ac31a01c8" containerName="registry-server" Feb 25 16:54:00 crc kubenswrapper[4937]: E0225 16:54:00.190915 4937 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86daca3-c59c-4899-94f1-f00ac31a01c8" containerName="extract-utilities" Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.190921 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86daca3-c59c-4899-94f1-f00ac31a01c8" containerName="extract-utilities" Feb 25 16:54:00 crc kubenswrapper[4937]: E0225 16:54:00.190935 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86daca3-c59c-4899-94f1-f00ac31a01c8" containerName="extract-content" Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.190941 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86daca3-c59c-4899-94f1-f00ac31a01c8" containerName="extract-content" Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.191156 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="f86daca3-c59c-4899-94f1-f00ac31a01c8" containerName="registry-server" Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.191866 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533974-p9qq8" Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.198205 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.199020 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.199216 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.206768 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533974-p9qq8"] Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.300864 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22s9z\" (UniqueName: \"kubernetes.io/projected/eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03-kube-api-access-22s9z\") pod \"auto-csr-approver-29533974-p9qq8\" (UID: \"eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03\") " pod="openshift-infra/auto-csr-approver-29533974-p9qq8" Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.403017 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22s9z\" (UniqueName: \"kubernetes.io/projected/eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03-kube-api-access-22s9z\") pod \"auto-csr-approver-29533974-p9qq8\" (UID: \"eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03\") " pod="openshift-infra/auto-csr-approver-29533974-p9qq8" Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.422233 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22s9z\" (UniqueName: \"kubernetes.io/projected/eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03-kube-api-access-22s9z\") pod \"auto-csr-approver-29533974-p9qq8\" (UID: \"eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03\") " pod="openshift-infra/auto-csr-approver-29533974-p9qq8" Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.514838 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533974-p9qq8" Feb 25 16:54:00 crc kubenswrapper[4937]: I0225 16:54:00.940459 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533974-p9qq8"] Feb 25 16:54:01 crc kubenswrapper[4937]: I0225 16:54:01.944635 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533974-p9qq8" event={"ID":"eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03","Type":"ContainerStarted","Data":"55e4a198eb065fcacd4ae7c69ca4acec587e11ec55d61a2956f6a0ee34403cad"} Feb 25 16:54:02 crc kubenswrapper[4937]: I0225 16:54:02.958261 4937 generic.go:334] "Generic (PLEG): container finished" podID="eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03" containerID="44e5fad3959c5a8bcc1467137d12bc42ab0bd19b9b02df271413ee76bf7e13bd" exitCode=0 Feb 25 16:54:02 crc kubenswrapper[4937]: I0225 16:54:02.958344 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533974-p9qq8" event={"ID":"eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03","Type":"ContainerDied","Data":"44e5fad3959c5a8bcc1467137d12bc42ab0bd19b9b02df271413ee76bf7e13bd"} Feb 25 16:54:04 crc kubenswrapper[4937]: I0225 16:54:04.574312 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533974-p9qq8" Feb 25 16:54:04 crc kubenswrapper[4937]: I0225 16:54:04.695318 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22s9z\" (UniqueName: \"kubernetes.io/projected/eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03-kube-api-access-22s9z\") pod \"eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03\" (UID: \"eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03\") " Feb 25 16:54:04 crc kubenswrapper[4937]: I0225 16:54:04.699911 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03-kube-api-access-22s9z" (OuterVolumeSpecName: "kube-api-access-22s9z") pod "eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03" (UID: "eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03"). InnerVolumeSpecName "kube-api-access-22s9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:54:04 crc kubenswrapper[4937]: I0225 16:54:04.797725 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22s9z\" (UniqueName: \"kubernetes.io/projected/eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03-kube-api-access-22s9z\") on node \"crc\" DevicePath \"\"" Feb 25 16:54:04 crc kubenswrapper[4937]: I0225 16:54:04.979524 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533974-p9qq8" event={"ID":"eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03","Type":"ContainerDied","Data":"55e4a198eb065fcacd4ae7c69ca4acec587e11ec55d61a2956f6a0ee34403cad"} Feb 25 16:54:04 crc kubenswrapper[4937]: I0225 16:54:04.979560 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55e4a198eb065fcacd4ae7c69ca4acec587e11ec55d61a2956f6a0ee34403cad" Feb 25 16:54:04 crc kubenswrapper[4937]: I0225 16:54:04.979607 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533974-p9qq8" Feb 25 16:54:05 crc kubenswrapper[4937]: I0225 16:54:05.641077 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533968-z95bx"] Feb 25 16:54:05 crc kubenswrapper[4937]: I0225 16:54:05.650344 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533968-z95bx"] Feb 25 16:54:07 crc kubenswrapper[4937]: I0225 16:54:07.387827 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d1be3d1-6300-4b05-817c-9681567a5f8d" path="/var/lib/kubelet/pods/4d1be3d1-6300-4b05-817c-9681567a5f8d/volumes" Feb 25 16:54:44 crc kubenswrapper[4937]: I0225 16:54:44.017981 4937 scope.go:117] "RemoveContainer" containerID="928baeafa4e24ae587779daec1d9e7ea0e35351405dd46039e2ce6a0bb05cd46" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.525505 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bs75f/must-gather-6bkg2"] Feb 25 16:54:53 crc kubenswrapper[4937]: E0225 16:54:53.526739 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03" containerName="oc" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.526761 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03" containerName="oc" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.526983 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03" containerName="oc" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.528505 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bs75f/must-gather-6bkg2" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.531076 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bs75f"/"openshift-service-ca.crt" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.531289 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bs75f"/"kube-root-ca.crt" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.531302 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bs75f"/"default-dockercfg-65wsr" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.551711 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db789b16-3221-4d7f-a3ac-10a2b3169ad5-must-gather-output\") pod \"must-gather-6bkg2\" (UID: \"db789b16-3221-4d7f-a3ac-10a2b3169ad5\") " pod="openshift-must-gather-bs75f/must-gather-6bkg2" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.552074 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tplz7\" (UniqueName: \"kubernetes.io/projected/db789b16-3221-4d7f-a3ac-10a2b3169ad5-kube-api-access-tplz7\") pod \"must-gather-6bkg2\" (UID: \"db789b16-3221-4d7f-a3ac-10a2b3169ad5\") " pod="openshift-must-gather-bs75f/must-gather-6bkg2" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.558581 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bs75f/must-gather-6bkg2"] Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.654494 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/db789b16-3221-4d7f-a3ac-10a2b3169ad5-must-gather-output\") pod \"must-gather-6bkg2\" (UID: \"db789b16-3221-4d7f-a3ac-10a2b3169ad5\") " pod="openshift-must-gather-bs75f/must-gather-6bkg2" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.654635 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tplz7\" (UniqueName: \"kubernetes.io/projected/db789b16-3221-4d7f-a3ac-10a2b3169ad5-kube-api-access-tplz7\") pod \"must-gather-6bkg2\" (UID: \"db789b16-3221-4d7f-a3ac-10a2b3169ad5\") " pod="openshift-must-gather-bs75f/must-gather-6bkg2" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.655178 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db789b16-3221-4d7f-a3ac-10a2b3169ad5-must-gather-output\") pod \"must-gather-6bkg2\" (UID: \"db789b16-3221-4d7f-a3ac-10a2b3169ad5\") " pod="openshift-must-gather-bs75f/must-gather-6bkg2" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.694611 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tplz7\" (UniqueName: \"kubernetes.io/projected/db789b16-3221-4d7f-a3ac-10a2b3169ad5-kube-api-access-tplz7\") pod \"must-gather-6bkg2\" (UID: \"db789b16-3221-4d7f-a3ac-10a2b3169ad5\") " pod="openshift-must-gather-bs75f/must-gather-6bkg2" Feb 25 16:54:53 crc kubenswrapper[4937]: I0225 16:54:53.851511 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bs75f/must-gather-6bkg2" Feb 25 16:54:54 crc kubenswrapper[4937]: I0225 16:54:54.316635 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bs75f/must-gather-6bkg2"] Feb 25 16:54:54 crc kubenswrapper[4937]: I0225 16:54:54.567265 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bs75f/must-gather-6bkg2" event={"ID":"db789b16-3221-4d7f-a3ac-10a2b3169ad5","Type":"ContainerStarted","Data":"12e5065a685fef7f5652c024fe5bf8d906819ec939beab25954cf0b23b9d5a24"} Feb 25 16:54:55 crc kubenswrapper[4937]: I0225 16:54:55.577189 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bs75f/must-gather-6bkg2" event={"ID":"db789b16-3221-4d7f-a3ac-10a2b3169ad5","Type":"ContainerStarted","Data":"ed4c3ada1d83a22ab98d9fb4490da1984b3d1421a1537db9aa87408f6d218b5f"} Feb 25 16:54:55 crc kubenswrapper[4937]: I0225 16:54:55.578394 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bs75f/must-gather-6bkg2" event={"ID":"db789b16-3221-4d7f-a3ac-10a2b3169ad5","Type":"ContainerStarted","Data":"9a3b00aaa1d1a76a1dded6ad57e9c8e0b2c2c4672bd73e374ee55fb5c9c643b0"} Feb 25 16:54:55 crc kubenswrapper[4937]: I0225 16:54:55.598858 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bs75f/must-gather-6bkg2" podStartSLOduration=2.5988405439999998 podStartE2EDuration="2.598840544s" podCreationTimestamp="2026-02-25 16:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:54:55.592204378 +0000 UTC m=+4146.605596268" watchObservedRunningTime="2026-02-25 16:54:55.598840544 +0000 UTC m=+4146.612232424" Feb 25 16:54:58 crc kubenswrapper[4937]: I0225 16:54:58.595683 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bs75f/crc-debug-qxvw4"] Feb 25 16:54:58 crc kubenswrapper[4937]: I0225 16:54:58.597991 4937 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bs75f/crc-debug-qxvw4" Feb 25 16:54:58 crc kubenswrapper[4937]: I0225 16:54:58.651406 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj975\" (UniqueName: \"kubernetes.io/projected/0a50c32f-8295-47f6-ac4c-fc110e3f9dfe-kube-api-access-rj975\") pod \"crc-debug-qxvw4\" (UID: \"0a50c32f-8295-47f6-ac4c-fc110e3f9dfe\") " pod="openshift-must-gather-bs75f/crc-debug-qxvw4" Feb 25 16:54:58 crc kubenswrapper[4937]: I0225 16:54:58.651772 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a50c32f-8295-47f6-ac4c-fc110e3f9dfe-host\") pod \"crc-debug-qxvw4\" (UID: \"0a50c32f-8295-47f6-ac4c-fc110e3f9dfe\") " pod="openshift-must-gather-bs75f/crc-debug-qxvw4" Feb 25 16:54:58 crc kubenswrapper[4937]: I0225 16:54:58.754236 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a50c32f-8295-47f6-ac4c-fc110e3f9dfe-host\") pod \"crc-debug-qxvw4\" (UID: \"0a50c32f-8295-47f6-ac4c-fc110e3f9dfe\") " pod="openshift-must-gather-bs75f/crc-debug-qxvw4" Feb 25 16:54:58 crc kubenswrapper[4937]: I0225 16:54:58.754406 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj975\" (UniqueName: \"kubernetes.io/projected/0a50c32f-8295-47f6-ac4c-fc110e3f9dfe-kube-api-access-rj975\") pod \"crc-debug-qxvw4\" (UID: \"0a50c32f-8295-47f6-ac4c-fc110e3f9dfe\") " pod="openshift-must-gather-bs75f/crc-debug-qxvw4" Feb 25 16:54:58 crc kubenswrapper[4937]: I0225 16:54:58.754424 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a50c32f-8295-47f6-ac4c-fc110e3f9dfe-host\") pod \"crc-debug-qxvw4\" (UID: \"0a50c32f-8295-47f6-ac4c-fc110e3f9dfe\") " pod="openshift-must-gather-bs75f/crc-debug-qxvw4" Feb 25 16:54:58 crc kubenswrapper[4937]: I0225 16:54:58.775501 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj975\" (UniqueName: \"kubernetes.io/projected/0a50c32f-8295-47f6-ac4c-fc110e3f9dfe-kube-api-access-rj975\") pod \"crc-debug-qxvw4\" (UID: \"0a50c32f-8295-47f6-ac4c-fc110e3f9dfe\") " pod="openshift-must-gather-bs75f/crc-debug-qxvw4" Feb 25 16:54:58 crc kubenswrapper[4937]: I0225 16:54:58.919514 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bs75f/crc-debug-qxvw4" Feb 25 16:54:58 crc kubenswrapper[4937]: W0225 16:54:58.944063 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a50c32f_8295_47f6_ac4c_fc110e3f9dfe.slice/crio-8978c78d514686da1d66e04738213ad11d87e249f19ec17986275ba3370f6a8f WatchSource:0}: Error finding container 8978c78d514686da1d66e04738213ad11d87e249f19ec17986275ba3370f6a8f: Status 404 returned error can't find the container with id 8978c78d514686da1d66e04738213ad11d87e249f19ec17986275ba3370f6a8f Feb 25 16:54:59 crc kubenswrapper[4937]: I0225 16:54:59.609245 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bs75f/crc-debug-qxvw4" event={"ID":"0a50c32f-8295-47f6-ac4c-fc110e3f9dfe","Type":"ContainerStarted","Data":"59eb7494a8a40b7fd5e93832e4bc34e2c9de366e64419d8c2a311ad00aa6ee96"} Feb 25 16:54:59 crc kubenswrapper[4937]: I0225 16:54:59.609619 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bs75f/crc-debug-qxvw4" event={"ID":"0a50c32f-8295-47f6-ac4c-fc110e3f9dfe","Type":"ContainerStarted","Data":"8978c78d514686da1d66e04738213ad11d87e249f19ec17986275ba3370f6a8f"} Feb 25 16:54:59 crc kubenswrapper[4937]: I0225 16:54:59.627750 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bs75f/crc-debug-qxvw4" podStartSLOduration=1.627732178 podStartE2EDuration="1.627732178s" podCreationTimestamp="2026-02-25 16:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:54:59.622313192 +0000 UTC m=+4150.635705082" watchObservedRunningTime="2026-02-25 16:54:59.627732178 +0000 UTC m=+4150.641124068" Feb 25 16:55:52 crc kubenswrapper[4937]: I0225 16:55:52.087379 4937 generic.go:334] "Generic (PLEG): container finished" podID="0a50c32f-8295-47f6-ac4c-fc110e3f9dfe" containerID="59eb7494a8a40b7fd5e93832e4bc34e2c9de366e64419d8c2a311ad00aa6ee96" exitCode=0 Feb 25 16:55:52 crc kubenswrapper[4937]: I0225 16:55:52.087533 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bs75f/crc-debug-qxvw4" event={"ID":"0a50c32f-8295-47f6-ac4c-fc110e3f9dfe","Type":"ContainerDied","Data":"59eb7494a8a40b7fd5e93832e4bc34e2c9de366e64419d8c2a311ad00aa6ee96"} Feb 25 16:55:53 crc kubenswrapper[4937]: I0225 16:55:53.176891 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bs75f/crc-debug-qxvw4" Feb 25 16:55:53 crc kubenswrapper[4937]: I0225 16:55:53.208155 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bs75f/crc-debug-qxvw4"] Feb 25 16:55:53 crc kubenswrapper[4937]: I0225 16:55:53.216342 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bs75f/crc-debug-qxvw4"] Feb 25 16:55:53 crc kubenswrapper[4937]: I0225 16:55:53.294803 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a50c32f-8295-47f6-ac4c-fc110e3f9dfe-host\") pod \"0a50c32f-8295-47f6-ac4c-fc110e3f9dfe\" (UID: \"0a50c32f-8295-47f6-ac4c-fc110e3f9dfe\") " Feb 25 16:55:53 crc kubenswrapper[4937]: I0225 16:55:53.294911 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a50c32f-8295-47f6-ac4c-fc110e3f9dfe-host" (OuterVolumeSpecName: "host") pod "0a50c32f-8295-47f6-ac4c-fc110e3f9dfe" (UID: "0a50c32f-8295-47f6-ac4c-fc110e3f9dfe"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:55:53 crc kubenswrapper[4937]: I0225 16:55:53.295142 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj975\" (UniqueName: \"kubernetes.io/projected/0a50c32f-8295-47f6-ac4c-fc110e3f9dfe-kube-api-access-rj975\") pod \"0a50c32f-8295-47f6-ac4c-fc110e3f9dfe\" (UID: \"0a50c32f-8295-47f6-ac4c-fc110e3f9dfe\") " Feb 25 16:55:53 crc kubenswrapper[4937]: I0225 16:55:53.295587 4937 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a50c32f-8295-47f6-ac4c-fc110e3f9dfe-host\") on node \"crc\" DevicePath \"\"" Feb 25 16:55:53 crc kubenswrapper[4937]: I0225 16:55:53.300884 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a50c32f-8295-47f6-ac4c-fc110e3f9dfe-kube-api-access-rj975" (OuterVolumeSpecName: "kube-api-access-rj975") pod "0a50c32f-8295-47f6-ac4c-fc110e3f9dfe" (UID: "0a50c32f-8295-47f6-ac4c-fc110e3f9dfe"). InnerVolumeSpecName "kube-api-access-rj975". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:55:53 crc kubenswrapper[4937]: I0225 16:55:53.377313 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a50c32f-8295-47f6-ac4c-fc110e3f9dfe" path="/var/lib/kubelet/pods/0a50c32f-8295-47f6-ac4c-fc110e3f9dfe/volumes" Feb 25 16:55:53 crc kubenswrapper[4937]: I0225 16:55:53.397017 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj975\" (UniqueName: \"kubernetes.io/projected/0a50c32f-8295-47f6-ac4c-fc110e3f9dfe-kube-api-access-rj975\") on node \"crc\" DevicePath \"\"" Feb 25 16:55:54 crc kubenswrapper[4937]: I0225 16:55:54.111991 4937 scope.go:117] "RemoveContainer" containerID="59eb7494a8a40b7fd5e93832e4bc34e2c9de366e64419d8c2a311ad00aa6ee96" Feb 25 16:55:54 crc kubenswrapper[4937]: I0225 16:55:54.112126 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bs75f/crc-debug-qxvw4" Feb 25 16:55:54 crc kubenswrapper[4937]: I0225 16:55:54.477436 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bs75f/crc-debug-dzlx8"] Feb 25 16:55:54 crc kubenswrapper[4937]: E0225 16:55:54.478266 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a50c32f-8295-47f6-ac4c-fc110e3f9dfe" containerName="container-00" Feb 25 16:55:54 crc kubenswrapper[4937]: I0225 16:55:54.478280 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a50c32f-8295-47f6-ac4c-fc110e3f9dfe" containerName="container-00" Feb 25 16:55:54 crc kubenswrapper[4937]: I0225 16:55:54.479047 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a50c32f-8295-47f6-ac4c-fc110e3f9dfe" containerName="container-00" Feb 25 16:55:54 crc kubenswrapper[4937]: I0225 16:55:54.479981 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bs75f/crc-debug-dzlx8" Feb 25 16:55:54 crc kubenswrapper[4937]: I0225 16:55:54.625337 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44f5f3c4-6f44-48d4-8e84-7080285ccb5b-host\") pod \"crc-debug-dzlx8\" (UID: \"44f5f3c4-6f44-48d4-8e84-7080285ccb5b\") " pod="openshift-must-gather-bs75f/crc-debug-dzlx8" Feb 25 16:55:54 crc kubenswrapper[4937]: I0225 16:55:54.625407 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-827l9\" (UniqueName: \"kubernetes.io/projected/44f5f3c4-6f44-48d4-8e84-7080285ccb5b-kube-api-access-827l9\") pod \"crc-debug-dzlx8\" (UID: \"44f5f3c4-6f44-48d4-8e84-7080285ccb5b\") " pod="openshift-must-gather-bs75f/crc-debug-dzlx8" Feb 25 16:55:54 crc kubenswrapper[4937]: I0225 16:55:54.727179 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-827l9\" (UniqueName: \"kubernetes.io/projected/44f5f3c4-6f44-48d4-8e84-7080285ccb5b-kube-api-access-827l9\") pod \"crc-debug-dzlx8\" (UID: \"44f5f3c4-6f44-48d4-8e84-7080285ccb5b\") " pod="openshift-must-gather-bs75f/crc-debug-dzlx8" Feb 25 16:55:54 crc kubenswrapper[4937]: I0225 16:55:54.727221 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44f5f3c4-6f44-48d4-8e84-7080285ccb5b-host\") pod \"crc-debug-dzlx8\" (UID: \"44f5f3c4-6f44-48d4-8e84-7080285ccb5b\") " pod="openshift-must-gather-bs75f/crc-debug-dzlx8" Feb 25 16:55:54 crc kubenswrapper[4937]: I0225 16:55:54.727353 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44f5f3c4-6f44-48d4-8e84-7080285ccb5b-host\") pod \"crc-debug-dzlx8\" (UID: \"44f5f3c4-6f44-48d4-8e84-7080285ccb5b\") " pod="openshift-must-gather-bs75f/crc-debug-dzlx8" Feb 25 16:55:54 crc kubenswrapper[4937]: I0225 16:55:54.746179 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-827l9\" (UniqueName: \"kubernetes.io/projected/44f5f3c4-6f44-48d4-8e84-7080285ccb5b-kube-api-access-827l9\") pod \"crc-debug-dzlx8\" (UID: \"44f5f3c4-6f44-48d4-8e84-7080285ccb5b\") " pod="openshift-must-gather-bs75f/crc-debug-dzlx8" Feb 25 16:55:54 crc kubenswrapper[4937]: I0225 16:55:54.799896 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bs75f/crc-debug-dzlx8" Feb 25 16:55:54 crc kubenswrapper[4937]: W0225 16:55:54.833303 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44f5f3c4_6f44_48d4_8e84_7080285ccb5b.slice/crio-88a689a7dbd1052a1e6bdb909f401fafec27d71f58aae16553be7f830d738439 WatchSource:0}: Error finding container 88a689a7dbd1052a1e6bdb909f401fafec27d71f58aae16553be7f830d738439: Status 404 returned error can't find the container with id 88a689a7dbd1052a1e6bdb909f401fafec27d71f58aae16553be7f830d738439 Feb 25 16:55:55 crc kubenswrapper[4937]: I0225 16:55:55.121229 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bs75f/crc-debug-dzlx8" event={"ID":"44f5f3c4-6f44-48d4-8e84-7080285ccb5b","Type":"ContainerStarted","Data":"64970219d91036680ccfee4189a6a09a972b041d4a058b5979bb17c88c20eac8"} Feb 25 16:55:55 crc kubenswrapper[4937]: I0225 16:55:55.121698 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bs75f/crc-debug-dzlx8" event={"ID":"44f5f3c4-6f44-48d4-8e84-7080285ccb5b","Type":"ContainerStarted","Data":"88a689a7dbd1052a1e6bdb909f401fafec27d71f58aae16553be7f830d738439"} Feb 25 16:55:55 crc kubenswrapper[4937]: I0225 16:55:55.139774 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bs75f/crc-debug-dzlx8" podStartSLOduration=1.139760584 podStartE2EDuration="1.139760584s" podCreationTimestamp="2026-02-25 16:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 16:55:55.135144118 +0000 UTC m=+4206.148536008" watchObservedRunningTime="2026-02-25 16:55:55.139760584 +0000 UTC m=+4206.153152474" Feb 25 16:55:56 crc kubenswrapper[4937]: I0225 16:55:56.135598 4937 generic.go:334] "Generic (PLEG): container finished" podID="44f5f3c4-6f44-48d4-8e84-7080285ccb5b" containerID="64970219d91036680ccfee4189a6a09a972b041d4a058b5979bb17c88c20eac8" exitCode=0 Feb 25 16:55:56 crc kubenswrapper[4937]: I0225 16:55:56.135852 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bs75f/crc-debug-dzlx8" event={"ID":"44f5f3c4-6f44-48d4-8e84-7080285ccb5b","Type":"ContainerDied","Data":"64970219d91036680ccfee4189a6a09a972b041d4a058b5979bb17c88c20eac8"} Feb 25 16:55:57 crc kubenswrapper[4937]: I0225 16:55:57.240891 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bs75f/crc-debug-dzlx8" Feb 25 16:55:57 crc kubenswrapper[4937]: I0225 16:55:57.270796 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bs75f/crc-debug-dzlx8"] Feb 25 16:55:57 crc kubenswrapper[4937]: I0225 16:55:57.279771 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bs75f/crc-debug-dzlx8"] Feb 25 16:55:57 crc kubenswrapper[4937]: I0225 16:55:57.387914 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44f5f3c4-6f44-48d4-8e84-7080285ccb5b-host\") pod \"44f5f3c4-6f44-48d4-8e84-7080285ccb5b\" (UID: \"44f5f3c4-6f44-48d4-8e84-7080285ccb5b\") " Feb 25 16:55:57 crc kubenswrapper[4937]: I0225 16:55:57.388227 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-827l9\" (UniqueName: \"kubernetes.io/projected/44f5f3c4-6f44-48d4-8e84-7080285ccb5b-kube-api-access-827l9\") pod \"44f5f3c4-6f44-48d4-8e84-7080285ccb5b\" (UID: \"44f5f3c4-6f44-48d4-8e84-7080285ccb5b\") " Feb 25 16:55:57 crc kubenswrapper[4937]: I0225 16:55:57.388228 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44f5f3c4-6f44-48d4-8e84-7080285ccb5b-host" (OuterVolumeSpecName: "host") pod "44f5f3c4-6f44-48d4-8e84-7080285ccb5b" (UID: "44f5f3c4-6f44-48d4-8e84-7080285ccb5b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:55:57 crc kubenswrapper[4937]: I0225 16:55:57.407452 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f5f3c4-6f44-48d4-8e84-7080285ccb5b-kube-api-access-827l9" (OuterVolumeSpecName: "kube-api-access-827l9") pod "44f5f3c4-6f44-48d4-8e84-7080285ccb5b" (UID: "44f5f3c4-6f44-48d4-8e84-7080285ccb5b"). InnerVolumeSpecName "kube-api-access-827l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:55:57 crc kubenswrapper[4937]: I0225 16:55:57.424194 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f5f3c4-6f44-48d4-8e84-7080285ccb5b" path="/var/lib/kubelet/pods/44f5f3c4-6f44-48d4-8e84-7080285ccb5b/volumes" Feb 25 16:55:57 crc kubenswrapper[4937]: I0225 16:55:57.490943 4937 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44f5f3c4-6f44-48d4-8e84-7080285ccb5b-host\") on node \"crc\" DevicePath \"\"" Feb 25 16:55:57 crc kubenswrapper[4937]: I0225 16:55:57.491130 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-827l9\" (UniqueName: \"kubernetes.io/projected/44f5f3c4-6f44-48d4-8e84-7080285ccb5b-kube-api-access-827l9\") on node \"crc\" DevicePath \"\"" Feb 25 16:55:58 crc kubenswrapper[4937]: I0225 16:55:58.153463 4937 scope.go:117] "RemoveContainer" containerID="64970219d91036680ccfee4189a6a09a972b041d4a058b5979bb17c88c20eac8" Feb 25 16:55:58 crc kubenswrapper[4937]: I0225 16:55:58.153554 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bs75f/crc-debug-dzlx8" Feb 25 16:55:58 crc kubenswrapper[4937]: I0225 16:55:58.510320 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bs75f/crc-debug-88hrl"] Feb 25 16:55:58 crc kubenswrapper[4937]: E0225 16:55:58.511157 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f5f3c4-6f44-48d4-8e84-7080285ccb5b" containerName="container-00" Feb 25 16:55:58 crc kubenswrapper[4937]: I0225 16:55:58.511173 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f5f3c4-6f44-48d4-8e84-7080285ccb5b" containerName="container-00" Feb 25 16:55:58 crc kubenswrapper[4937]: I0225 16:55:58.511466 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f5f3c4-6f44-48d4-8e84-7080285ccb5b" containerName="container-00" Feb 25 16:55:58 crc kubenswrapper[4937]: I0225 16:55:58.512434 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bs75f/crc-debug-88hrl" Feb 25 16:55:58 crc kubenswrapper[4937]: I0225 16:55:58.612614 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e14c844-f118-41c9-bd51-a63b1e5898c5-host\") pod \"crc-debug-88hrl\" (UID: \"6e14c844-f118-41c9-bd51-a63b1e5898c5\") " pod="openshift-must-gather-bs75f/crc-debug-88hrl" Feb 25 16:55:58 crc kubenswrapper[4937]: I0225 16:55:58.612676 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbsft\" (UniqueName: \"kubernetes.io/projected/6e14c844-f118-41c9-bd51-a63b1e5898c5-kube-api-access-dbsft\") pod \"crc-debug-88hrl\" (UID: \"6e14c844-f118-41c9-bd51-a63b1e5898c5\") " pod="openshift-must-gather-bs75f/crc-debug-88hrl" Feb 25 16:55:58 crc kubenswrapper[4937]: I0225 16:55:58.715151 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e14c844-f118-41c9-bd51-a63b1e5898c5-host\") pod \"crc-debug-88hrl\" (UID: \"6e14c844-f118-41c9-bd51-a63b1e5898c5\") " pod="openshift-must-gather-bs75f/crc-debug-88hrl" Feb 25 16:55:58 crc kubenswrapper[4937]: I0225 16:55:58.715336 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e14c844-f118-41c9-bd51-a63b1e5898c5-host\") pod \"crc-debug-88hrl\" (UID: \"6e14c844-f118-41c9-bd51-a63b1e5898c5\") " pod="openshift-must-gather-bs75f/crc-debug-88hrl" Feb 25 16:55:58 crc kubenswrapper[4937]: I0225 16:55:58.715653 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbsft\" (UniqueName: \"kubernetes.io/projected/6e14c844-f118-41c9-bd51-a63b1e5898c5-kube-api-access-dbsft\") pod \"crc-debug-88hrl\" (UID: \"6e14c844-f118-41c9-bd51-a63b1e5898c5\") " pod="openshift-must-gather-bs75f/crc-debug-88hrl" Feb 25 16:55:58 crc kubenswrapper[4937]: I0225 16:55:58.737582 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbsft\" (UniqueName: \"kubernetes.io/projected/6e14c844-f118-41c9-bd51-a63b1e5898c5-kube-api-access-dbsft\") pod \"crc-debug-88hrl\" (UID: \"6e14c844-f118-41c9-bd51-a63b1e5898c5\") " pod="openshift-must-gather-bs75f/crc-debug-88hrl" Feb 25 16:55:58 crc kubenswrapper[4937]: I0225 16:55:58.833263 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bs75f/crc-debug-88hrl" Feb 25 16:55:59 crc kubenswrapper[4937]: I0225 16:55:59.177650 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bs75f/crc-debug-88hrl" event={"ID":"6e14c844-f118-41c9-bd51-a63b1e5898c5","Type":"ContainerStarted","Data":"a6119832385c9ba9906e28bc6435114d7cbd12cad0202178bef2c12999fa3241"} Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.146553 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533976-9dkfb"] Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.148670 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533976-9dkfb" Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.151224 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.151639 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.151861 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.163886 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533976-9dkfb"] Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.193243 4937 generic.go:334] "Generic (PLEG): container finished" podID="6e14c844-f118-41c9-bd51-a63b1e5898c5" containerID="d1c9eabc71181dddca16742309e7fdb1074738658ad91c8be04aa48175e227da" exitCode=0 Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.193291 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bs75f/crc-debug-88hrl" event={"ID":"6e14c844-f118-41c9-bd51-a63b1e5898c5","Type":"ContainerDied","Data":"d1c9eabc71181dddca16742309e7fdb1074738658ad91c8be04aa48175e227da"} Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.238131 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bs75f/crc-debug-88hrl"] Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.248396 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psx45\" (UniqueName: \"kubernetes.io/projected/50e723a3-7bd5-483a-8678-02a8df3a405b-kube-api-access-psx45\") pod \"auto-csr-approver-29533976-9dkfb\" (UID: \"50e723a3-7bd5-483a-8678-02a8df3a405b\") " pod="openshift-infra/auto-csr-approver-29533976-9dkfb" Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.251034 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bs75f/crc-debug-88hrl"] Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.350582 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psx45\" (UniqueName: \"kubernetes.io/projected/50e723a3-7bd5-483a-8678-02a8df3a405b-kube-api-access-psx45\") pod \"auto-csr-approver-29533976-9dkfb\" (UID: \"50e723a3-7bd5-483a-8678-02a8df3a405b\") " pod="openshift-infra/auto-csr-approver-29533976-9dkfb" Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.372209 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psx45\" (UniqueName: \"kubernetes.io/projected/50e723a3-7bd5-483a-8678-02a8df3a405b-kube-api-access-psx45\") pod \"auto-csr-approver-29533976-9dkfb\" (UID: 
\"50e723a3-7bd5-483a-8678-02a8df3a405b\") " pod="openshift-infra/auto-csr-approver-29533976-9dkfb" Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.470514 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533976-9dkfb" Feb 25 16:56:00 crc kubenswrapper[4937]: I0225 16:56:00.968565 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 16:56:01 crc kubenswrapper[4937]: I0225 16:56:01.006135 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533976-9dkfb"] Feb 25 16:56:01 crc kubenswrapper[4937]: I0225 16:56:01.202330 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533976-9dkfb" event={"ID":"50e723a3-7bd5-483a-8678-02a8df3a405b","Type":"ContainerStarted","Data":"e45e04ae8b9abd4654f501fdc347dcfd719de616184fee8585367ac12bdba492"} Feb 25 16:56:01 crc kubenswrapper[4937]: I0225 16:56:01.265540 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bs75f/crc-debug-88hrl" Feb 25 16:56:01 crc kubenswrapper[4937]: I0225 16:56:01.377573 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsft\" (UniqueName: \"kubernetes.io/projected/6e14c844-f118-41c9-bd51-a63b1e5898c5-kube-api-access-dbsft\") pod \"6e14c844-f118-41c9-bd51-a63b1e5898c5\" (UID: \"6e14c844-f118-41c9-bd51-a63b1e5898c5\") " Feb 25 16:56:01 crc kubenswrapper[4937]: I0225 16:56:01.379326 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e14c844-f118-41c9-bd51-a63b1e5898c5-host\") pod \"6e14c844-f118-41c9-bd51-a63b1e5898c5\" (UID: \"6e14c844-f118-41c9-bd51-a63b1e5898c5\") " Feb 25 16:56:01 crc kubenswrapper[4937]: I0225 16:56:01.379414 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e14c844-f118-41c9-bd51-a63b1e5898c5-host" (OuterVolumeSpecName: "host") pod "6e14c844-f118-41c9-bd51-a63b1e5898c5" (UID: "6e14c844-f118-41c9-bd51-a63b1e5898c5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 16:56:01 crc kubenswrapper[4937]: I0225 16:56:01.380769 4937 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e14c844-f118-41c9-bd51-a63b1e5898c5-host\") on node \"crc\" DevicePath \"\"" Feb 25 16:56:01 crc kubenswrapper[4937]: I0225 16:56:01.384112 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e14c844-f118-41c9-bd51-a63b1e5898c5-kube-api-access-dbsft" (OuterVolumeSpecName: "kube-api-access-dbsft") pod "6e14c844-f118-41c9-bd51-a63b1e5898c5" (UID: "6e14c844-f118-41c9-bd51-a63b1e5898c5"). InnerVolumeSpecName "kube-api-access-dbsft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:56:01 crc kubenswrapper[4937]: I0225 16:56:01.387119 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e14c844-f118-41c9-bd51-a63b1e5898c5" path="/var/lib/kubelet/pods/6e14c844-f118-41c9-bd51-a63b1e5898c5/volumes" Feb 25 16:56:01 crc kubenswrapper[4937]: I0225 16:56:01.482362 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsft\" (UniqueName: \"kubernetes.io/projected/6e14c844-f118-41c9-bd51-a63b1e5898c5-kube-api-access-dbsft\") on node \"crc\" DevicePath \"\"" Feb 25 16:56:02 crc kubenswrapper[4937]: I0225 16:56:02.228975 4937 scope.go:117] "RemoveContainer" containerID="d1c9eabc71181dddca16742309e7fdb1074738658ad91c8be04aa48175e227da" Feb 25 16:56:02 crc kubenswrapper[4937]: I0225 16:56:02.229112 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bs75f/crc-debug-88hrl" Feb 25 16:56:03 crc kubenswrapper[4937]: I0225 16:56:03.243326 4937 generic.go:334] "Generic (PLEG): container finished" podID="50e723a3-7bd5-483a-8678-02a8df3a405b" containerID="0bf340e507d0b47f56a4b858fc4de408aacf6bd50c6d07c793853b8ca29be45e" exitCode=0 Feb 25 16:56:03 crc kubenswrapper[4937]: I0225 16:56:03.243370 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533976-9dkfb" event={"ID":"50e723a3-7bd5-483a-8678-02a8df3a405b","Type":"ContainerDied","Data":"0bf340e507d0b47f56a4b858fc4de408aacf6bd50c6d07c793853b8ca29be45e"} Feb 25 16:56:04 crc kubenswrapper[4937]: I0225 16:56:04.950934 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533976-9dkfb" Feb 25 16:56:05 crc kubenswrapper[4937]: I0225 16:56:05.056457 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psx45\" (UniqueName: \"kubernetes.io/projected/50e723a3-7bd5-483a-8678-02a8df3a405b-kube-api-access-psx45\") pod \"50e723a3-7bd5-483a-8678-02a8df3a405b\" (UID: \"50e723a3-7bd5-483a-8678-02a8df3a405b\") " Feb 25 16:56:05 crc kubenswrapper[4937]: I0225 16:56:05.062102 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e723a3-7bd5-483a-8678-02a8df3a405b-kube-api-access-psx45" (OuterVolumeSpecName: "kube-api-access-psx45") pod "50e723a3-7bd5-483a-8678-02a8df3a405b" (UID: "50e723a3-7bd5-483a-8678-02a8df3a405b"). InnerVolumeSpecName "kube-api-access-psx45". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:56:05 crc kubenswrapper[4937]: I0225 16:56:05.159455 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psx45\" (UniqueName: \"kubernetes.io/projected/50e723a3-7bd5-483a-8678-02a8df3a405b-kube-api-access-psx45\") on node \"crc\" DevicePath \"\"" Feb 25 16:56:05 crc kubenswrapper[4937]: I0225 16:56:05.265395 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533976-9dkfb" event={"ID":"50e723a3-7bd5-483a-8678-02a8df3a405b","Type":"ContainerDied","Data":"e45e04ae8b9abd4654f501fdc347dcfd719de616184fee8585367ac12bdba492"} Feb 25 16:56:05 crc kubenswrapper[4937]: I0225 16:56:05.265440 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e45e04ae8b9abd4654f501fdc347dcfd719de616184fee8585367ac12bdba492" Feb 25 16:56:05 crc kubenswrapper[4937]: I0225 16:56:05.265521 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533976-9dkfb" Feb 25 16:56:06 crc kubenswrapper[4937]: I0225 16:56:06.026359 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533970-v9vrz"] Feb 25 16:56:06 crc kubenswrapper[4937]: I0225 16:56:06.035084 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533970-v9vrz"] Feb 25 16:56:07 crc kubenswrapper[4937]: I0225 16:56:07.386743 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e02b8833-357e-4e00-b36c-037e417e3acf" path="/var/lib/kubelet/pods/e02b8833-357e-4e00-b36c-037e417e3acf/volumes" Feb 25 16:56:11 crc kubenswrapper[4937]: I0225 16:56:11.494635 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:56:11 crc kubenswrapper[4937]: I0225 16:56:11.495242 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:56:41 crc kubenswrapper[4937]: I0225 16:56:41.495420 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:56:41 crc kubenswrapper[4937]: I0225 16:56:41.496335 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:56:44 crc kubenswrapper[4937]: I0225 16:56:44.122125 4937 scope.go:117] "RemoveContainer" containerID="29cb8ae0c5f659341c1483f0355d886968e5310df8559408592bec347aeaa2fb" Feb 25 16:56:45 crc kubenswrapper[4937]: I0225 16:56:45.230873 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d773f4d2-bec3-4379-a7a2-29975a18c85b/init-config-reloader/0.log" Feb 25 16:56:45 crc kubenswrapper[4937]: I0225 16:56:45.341107 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d773f4d2-bec3-4379-a7a2-29975a18c85b/init-config-reloader/0.log" Feb 25 16:56:45 crc kubenswrapper[4937]: I0225 16:56:45.387060 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d773f4d2-bec3-4379-a7a2-29975a18c85b/alertmanager/0.log" Feb 25 16:56:45 crc kubenswrapper[4937]: I0225 16:56:45.435300 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d773f4d2-bec3-4379-a7a2-29975a18c85b/config-reloader/0.log" Feb 25 16:56:45 crc kubenswrapper[4937]: I0225 16:56:45.593616 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c757b5c5d-sqs2g_11e5858d-ee0a-4f76-8863-25be5ef4df36/barbican-api/0.log" Feb 25 16:56:45 crc kubenswrapper[4937]: I0225 16:56:45.645351 
4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c757b5c5d-sqs2g_11e5858d-ee0a-4f76-8863-25be5ef4df36/barbican-api-log/0.log" Feb 25 16:56:45 crc kubenswrapper[4937]: I0225 16:56:45.940446 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b99d98bc-2r54q_394cbe6e-1697-449d-abaf-68e9ba275096/barbican-worker/0.log" Feb 25 16:56:46 crc kubenswrapper[4937]: I0225 16:56:46.110878 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b99d98bc-2r54q_394cbe6e-1697-449d-abaf-68e9ba275096/barbican-worker-log/0.log" Feb 25 16:56:46 crc kubenswrapper[4937]: I0225 16:56:46.198136 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6b9499bbcd-kr7kb_6520b70f-9bf6-4b3c-ad1e-4f43da8daec5/barbican-keystone-listener/0.log" Feb 25 16:56:46 crc kubenswrapper[4937]: I0225 16:56:46.259860 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6b9499bbcd-kr7kb_6520b70f-9bf6-4b3c-ad1e-4f43da8daec5/barbican-keystone-listener-log/0.log" Feb 25 16:56:46 crc kubenswrapper[4937]: I0225 16:56:46.806507 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-dffzc_165310e1-208b-4a29-a8fd-be630d60fc08/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:56:46 crc kubenswrapper[4937]: I0225 16:56:46.926075 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4d9d51be-46d2-4d06-8f81-f34e8693e52d/ceilometer-central-agent/0.log" Feb 25 16:56:46 crc kubenswrapper[4937]: I0225 16:56:46.942917 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4d9d51be-46d2-4d06-8f81-f34e8693e52d/ceilometer-notification-agent/0.log" Feb 25 16:56:47 crc kubenswrapper[4937]: I0225 16:56:47.036946 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4d9d51be-46d2-4d06-8f81-f34e8693e52d/proxy-httpd/0.log" Feb 25 16:56:47 crc kubenswrapper[4937]: I0225 16:56:47.121065 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4d9d51be-46d2-4d06-8f81-f34e8693e52d/sg-core/0.log" Feb 25 16:56:47 crc kubenswrapper[4937]: I0225 16:56:47.211624 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5716a1da-2a42-48cd-96cd-149adb030006/cinder-api/0.log" Feb 25 16:56:47 crc kubenswrapper[4937]: I0225 16:56:47.281073 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5716a1da-2a42-48cd-96cd-149adb030006/cinder-api-log/0.log" Feb 25 16:56:47 crc kubenswrapper[4937]: I0225 16:56:47.458212 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f6b49871-d3a3-4846-9c5d-3df7b920a420/probe/0.log" Feb 25 16:56:47 crc kubenswrapper[4937]: I0225 16:56:47.486712 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f6b49871-d3a3-4846-9c5d-3df7b920a420/cinder-scheduler/0.log" Feb 25 16:56:47 crc kubenswrapper[4937]: I0225 16:56:47.766971 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_5787788d-bec1-4541-a34d-26ab6b7f4aa5/cloudkitty-api-log/0.log" Feb 25 16:56:47 crc kubenswrapper[4937]: I0225 16:56:47.779686 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_5787788d-bec1-4541-a34d-26ab6b7f4aa5/cloudkitty-api/0.log" Feb 25 16:56:48 crc 
kubenswrapper[4937]: I0225 16:56:48.267329 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_123b3439-f7ab-44b4-bbed-02539668cf80/loki-compactor/0.log" Feb 25 16:56:48 crc kubenswrapper[4937]: I0225 16:56:48.355642 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-585d9bcbc-mplc7_48e5f2c6-d4ed-48a1-8737-693b54c43613/loki-distributor/0.log" Feb 25 16:56:48 crc kubenswrapper[4937]: I0225 16:56:48.508415 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-t9wss_e13a5d7c-5a1a-466b-83a0-d76859e2cd3e/gateway/0.log" Feb 25 16:56:48 crc kubenswrapper[4937]: I0225 16:56:48.536009 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-v7lt7_836ae71a-cf0f-4a00-a0bc-78d1be68f830/gateway/0.log" Feb 25 16:56:48 crc kubenswrapper[4937]: I0225 16:56:48.847959 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_c4c8966b-44e5-42fd-ae20-d3099876ee36/loki-index-gateway/0.log" Feb 25 16:56:49 crc kubenswrapper[4937]: I0225 16:56:49.031163 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_08382e6d-e8e5-4656-a524-26c8269114fd/loki-ingester/0.log" Feb 25 16:56:49 crc kubenswrapper[4937]: I0225 16:56:49.263896 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-67bb4dfcd8-dh96p_5d351b94-5168-4f7f-9d70-c2cd2225dba8/loki-query-frontend/0.log" Feb 25 16:56:49 crc kubenswrapper[4937]: I0225 16:56:49.351523 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-58c84b5844-86646_f72068e0-28e8-4c10-abeb-c067fe29c2f4/loki-querier/0.log" Feb 25 16:56:49 crc kubenswrapper[4937]: I0225 16:56:49.520254 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-h5j54_d0caaa2f-df02-4bb7-a490-f3333d6c47a2/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:56:49 crc kubenswrapper[4937]: I0225 16:56:49.839799 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-2pqjp_1bd696e6-be36-4b9e-9f00-9ba293305842/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:56:50 crc kubenswrapper[4937]: I0225 16:56:50.241491 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-hjwwn_4ec5a514-9a47-413c-8e18-113b8295e0b7/init/0.log" Feb 25 16:56:50 crc kubenswrapper[4937]: I0225 16:56:50.424127 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-hjwwn_4ec5a514-9a47-413c-8e18-113b8295e0b7/init/0.log" Feb 25 16:56:50 crc kubenswrapper[4937]: I0225 16:56:50.455375 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-29n8w_e34d42d5-94de-45fe-b002-65da3cd1d49d/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:56:50 crc kubenswrapper[4937]: I0225 16:56:50.514575 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-hjwwn_4ec5a514-9a47-413c-8e18-113b8295e0b7/dnsmasq-dns/0.log" Feb 25 16:56:50 crc kubenswrapper[4937]: I0225 16:56:50.665931 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_c50d4693-04e4-40a4-a07d-9475ce9b0125/glance-httpd/0.log" Feb 25 16:56:50 crc kubenswrapper[4937]: I0225 16:56:50.741304 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c50d4693-04e4-40a4-a07d-9475ce9b0125/glance-log/0.log" Feb 25 16:56:50 crc kubenswrapper[4937]: I0225 16:56:50.897698 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_55cf40d2-3819-46a7-b9c1-aad7f3a65542/glance-httpd/0.log" Feb 25 16:56:50 crc kubenswrapper[4937]: I0225 16:56:50.922261 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_55cf40d2-3819-46a7-b9c1-aad7f3a65542/glance-log/0.log" Feb 25 16:56:51 crc kubenswrapper[4937]: I0225 16:56:51.120963 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-r74cs_d08f7150-84a3-42bf-bed8-624a7f5e2c35/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:56:51 crc kubenswrapper[4937]: I0225 16:56:51.326774 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-2mgkn_4784f56a-332c-45b1-b121-ec925aece823/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:56:51 crc kubenswrapper[4937]: I0225 16:56:51.543309 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_b0312225-730b-46a3-8142-6a39e9d69f60/kube-state-metrics/0.log" Feb 25 16:56:51 crc kubenswrapper[4937]: I0225 16:56:51.806462 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-bzxct_b1499078-381f-48bd-bcfb-c9bd057fa5d2/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:56:51 crc kubenswrapper[4937]: I0225 16:56:51.852494 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7595479948-g6dtl_e3d2f89f-c1af-45d5-bfdd-6f9c3141c124/keystone-api/0.log" Feb 25 16:56:52 crc kubenswrapper[4937]: I0225 16:56:52.218447 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57ff6d8577-ntrmb_351a0bd5-2cd4-4f52-af68-6d86a512add0/neutron-httpd/0.log" Feb 25 16:56:52 crc kubenswrapper[4937]: I0225 16:56:52.347687 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57ff6d8577-ntrmb_351a0bd5-2cd4-4f52-af68-6d86a512add0/neutron-api/0.log" Feb 25 16:56:52 crc kubenswrapper[4937]: I0225 16:56:52.476099 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-xhwss_9851d2ed-9455-4797-bcad-ed3b82909df5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:56:53 crc kubenswrapper[4937]: I0225 16:56:53.126376 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e90a8e88-5fc8-48fe-af70-c6f6553d8b62/nova-api-log/0.log" Feb 25 16:56:53 crc kubenswrapper[4937]: I0225 16:56:53.585557 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c69df688-29bc-47e8-98ef-56b506f9e7c1/nova-cell0-conductor-conductor/0.log" Feb 25 16:56:53 crc kubenswrapper[4937]: I0225 16:56:53.646406 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e90a8e88-5fc8-48fe-af70-c6f6553d8b62/nova-api-api/0.log" Feb 25 16:56:53 crc kubenswrapper[4937]: I0225 16:56:53.904779 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_f95d4ab5-6b4e-477f-848e-0b98b93c8ba1/nova-cell1-conductor-conductor/0.log" Feb 25 16:56:54 crc kubenswrapper[4937]: I0225 16:56:54.054737 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d447480a-3bd1-4934-9ba7-73122b37df7c/nova-cell1-novncproxy-novncproxy/0.log" Feb 25 16:56:54 crc kubenswrapper[4937]: I0225 16:56:54.155710 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-9w4zb_dbc3ffd6-39f1-4130-9083-033d890d558d/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:56:54 crc kubenswrapper[4937]: I0225 16:56:54.490139 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9d7981f5-bfaf-41a0-a577-ab25b40dc375/nova-metadata-log/0.log" Feb 25 16:56:55 crc kubenswrapper[4937]: I0225 16:56:55.054075 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_987df1b8-49a2-4ec2-b92d-64619d55b516/nova-scheduler-scheduler/0.log" Feb 25 16:56:55 crc kubenswrapper[4937]: I0225 16:56:55.183956 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9e2484d7-6d50-43d2-9105-e83280f565ac/mysql-bootstrap/0.log" Feb 25 16:56:55 crc kubenswrapper[4937]: I0225 16:56:55.366982 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9e2484d7-6d50-43d2-9105-e83280f565ac/mysql-bootstrap/0.log" Feb 25 16:56:55 crc kubenswrapper[4937]: I0225 16:56:55.441971 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9e2484d7-6d50-43d2-9105-e83280f565ac/galera/0.log" Feb 25 16:56:55 crc kubenswrapper[4937]: I0225 16:56:55.754367 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe/mysql-bootstrap/0.log" Feb 25 16:56:55 crc kubenswrapper[4937]: I0225 16:56:55.998245 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9d7981f5-bfaf-41a0-a577-ab25b40dc375/nova-metadata-metadata/0.log" Feb 25 16:56:56 crc kubenswrapper[4937]: I0225 16:56:56.025631 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe/mysql-bootstrap/0.log" Feb 25 16:56:56 crc kubenswrapper[4937]: I0225 16:56:56.035799 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e0d3c0ee-079b-4bdc-9fdb-5d796b88b7fe/galera/0.log" Feb 25 16:56:56 crc kubenswrapper[4937]: I0225 16:56:56.247226 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a03635e4-24a3-460b-ab0e-e3f677ac95c5/openstackclient/0.log" Feb 25 16:56:56 crc kubenswrapper[4937]: I0225 16:56:56.659728 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-lkm6g_d9c73995-8885-40f2-8491-6216d1ec5c7b/openstack-network-exporter/0.log" Feb 25 16:56:56 crc kubenswrapper[4937]: I0225 16:56:56.850429 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4rsxl_388f0d04-d580-46ae-a729-667d81ad11a0/ovsdb-server-init/0.log" Feb 25 16:56:57 crc kubenswrapper[4937]: I0225 16:56:57.131414 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4rsxl_388f0d04-d580-46ae-a729-667d81ad11a0/ovsdb-server-init/0.log" Feb 25 16:56:57 crc kubenswrapper[4937]: I0225 16:56:57.165942 4937 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4rsxl_388f0d04-d580-46ae-a729-667d81ad11a0/ovs-vswitchd/0.log" Feb 25 16:56:57 crc kubenswrapper[4937]: I0225 16:56:57.215779 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4rsxl_388f0d04-d580-46ae-a729-667d81ad11a0/ovsdb-server/0.log" Feb 25 16:56:57 crc kubenswrapper[4937]: I0225 16:56:57.771818 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-sgdhb_c0b0baed-3140-4ac4-9d27-e8fc15c390c2/ovn-controller/0.log" Feb 25 16:56:58 crc kubenswrapper[4937]: I0225 16:56:58.133280 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-4mrhf_af8ba197-d732-4514-9b22-4d2aa6f5a7f6/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:56:58 crc kubenswrapper[4937]: I0225 16:56:58.175675 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8ce79682-d3ee-4afb-ba50-fdacc0fe6910/openstack-network-exporter/0.log" Feb 25 16:56:58 crc kubenswrapper[4937]: I0225 16:56:58.401274 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8ce79682-d3ee-4afb-ba50-fdacc0fe6910/ovn-northd/0.log" Feb 25 16:56:58 crc kubenswrapper[4937]: I0225 16:56:58.443766 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0a044be7-a58d-4684-8252-5a850694fb04/openstack-network-exporter/0.log" Feb 25 16:56:58 crc kubenswrapper[4937]: I0225 16:56:58.591399 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0a044be7-a58d-4684-8252-5a850694fb04/ovsdbserver-nb/0.log" Feb 25 16:56:58 crc kubenswrapper[4937]: I0225 16:56:58.669518 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08/openstack-network-exporter/0.log" Feb 25 16:56:59 crc kubenswrapper[4937]: I0225 16:56:59.460662 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a02ae7ff-cbe3-4dae-9c3f-6dd285e0bd08/ovsdbserver-sb/0.log" Feb 25 16:56:59 crc kubenswrapper[4937]: I0225 16:56:59.536524 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-575b75bdd-mz6p6_b9abdfa0-773f-4d50-ae2d-8d7a429b5df7/placement-api/0.log" Feb 25 16:56:59 crc kubenswrapper[4937]: I0225 16:56:59.734889 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-575b75bdd-mz6p6_b9abdfa0-773f-4d50-ae2d-8d7a429b5df7/placement-log/0.log" Feb 25 16:56:59 crc kubenswrapper[4937]: I0225 16:56:59.945478 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a/init-config-reloader/0.log" Feb 25 16:57:00 crc kubenswrapper[4937]: I0225 16:57:00.145874 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a/init-config-reloader/0.log" Feb 25 16:57:00 crc kubenswrapper[4937]: I0225 16:57:00.146969 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a/config-reloader/0.log" Feb 25 16:57:00 crc kubenswrapper[4937]: I0225 16:57:00.181113 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a/prometheus/0.log" Feb 25 16:57:00 crc kubenswrapper[4937]: I0225 
16:57:00.445462 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d2dd3e8-6ce0-4f39-9298-bcbb6b32559a/thanos-sidecar/0.log" Feb 25 16:57:00 crc kubenswrapper[4937]: I0225 16:57:00.499847 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ab7e006f-0788-42e5-aee9-543e29514c09/setup-container/0.log" Feb 25 16:57:00 crc kubenswrapper[4937]: I0225 16:57:00.706172 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ab7e006f-0788-42e5-aee9-543e29514c09/rabbitmq/0.log" Feb 25 16:57:00 crc kubenswrapper[4937]: I0225 16:57:00.724684 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ab7e006f-0788-42e5-aee9-543e29514c09/setup-container/0.log" Feb 25 16:57:00 crc kubenswrapper[4937]: I0225 16:57:00.881549 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4779f4bd-7580-49e7-b536-ce3b8c77a8d4/setup-container/0.log" Feb 25 16:57:01 crc kubenswrapper[4937]: I0225 16:57:01.128694 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4779f4bd-7580-49e7-b536-ce3b8c77a8d4/setup-container/0.log" Feb 25 16:57:01 crc kubenswrapper[4937]: I0225 16:57:01.176550 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4779f4bd-7580-49e7-b536-ce3b8c77a8d4/rabbitmq/0.log" Feb 25 16:57:01 crc kubenswrapper[4937]: I0225 16:57:01.440224 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_d41b7062-11e8-401a-a063-8467cf1da4f2/cloudkitty-proc/0.log" Feb 25 16:57:01 crc kubenswrapper[4937]: I0225 16:57:01.443687 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-rd7wv_7edeb14d-a4c4-402a-a45f-b30a6f23ffe9/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:57:01 crc kubenswrapper[4937]: I0225 16:57:01.474023 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-78hr5_9cb9799d-3115-4657-a7f3-18fbcb14a073/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:57:01 crc kubenswrapper[4937]: I0225 16:57:01.694898 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-w6g8j_6ee081e9-3c3e-4bd7-9c7d-a4a917946879/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:57:01 crc kubenswrapper[4937]: I0225 16:57:01.796469 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-hxbrd_f9e83917-e8a9-4ec3-9714-591147de094e/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:57:01 crc kubenswrapper[4937]: I0225 16:57:01.971279 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-v5wvt_2793b4d3-40ec-416d-8a93-0bb9b23ab909/ssh-known-hosts-edpm-deployment/0.log" Feb 25 16:57:02 crc kubenswrapper[4937]: I0225 16:57:02.165909 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-75675bb4d7-q28jd_c0193db4-c078-4d8c-8437-538da8d426d2/proxy-server/0.log" Feb 25 16:57:02 crc kubenswrapper[4937]: I0225 16:57:02.180734 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-75675bb4d7-q28jd_c0193db4-c078-4d8c-8437-538da8d426d2/proxy-httpd/0.log" Feb 25 16:57:02 crc kubenswrapper[4937]: I0225 16:57:02.326299 
4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-6nkbm_0a0f0530-95e1-4231-9933-bedb49b72a88/swift-ring-rebalance/0.log" Feb 25 16:57:02 crc kubenswrapper[4937]: I0225 16:57:02.442047 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/account-auditor/0.log" Feb 25 16:57:02 crc kubenswrapper[4937]: I0225 16:57:02.558415 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/account-reaper/0.log" Feb 25 16:57:02 crc kubenswrapper[4937]: I0225 16:57:02.663136 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/account-replicator/0.log" Feb 25 16:57:02 crc kubenswrapper[4937]: I0225 16:57:02.758989 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/account-server/0.log" Feb 25 16:57:02 crc kubenswrapper[4937]: I0225 16:57:02.840424 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/container-auditor/0.log" Feb 25 16:57:02 crc kubenswrapper[4937]: I0225 16:57:02.897865 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/container-server/0.log" Feb 25 16:57:02 crc kubenswrapper[4937]: I0225 16:57:02.946924 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/container-replicator/0.log" Feb 25 16:57:02 crc kubenswrapper[4937]: I0225 16:57:02.989415 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/container-updater/0.log" Feb 25 16:57:03 crc kubenswrapper[4937]: I0225 16:57:03.102455 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/object-expirer/0.log" Feb 25 16:57:03 crc kubenswrapper[4937]: I0225 16:57:03.105978 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/object-auditor/0.log" Feb 25 16:57:03 crc kubenswrapper[4937]: I0225 16:57:03.207201 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/object-replicator/0.log" Feb 25 16:57:03 crc kubenswrapper[4937]: I0225 16:57:03.366600 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/object-server/0.log" Feb 25 16:57:03 crc kubenswrapper[4937]: I0225 16:57:03.564302 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/swift-recon-cron/0.log" Feb 25 16:57:03 crc kubenswrapper[4937]: I0225 16:57:03.565444 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/rsync/0.log" Feb 25 16:57:03 crc kubenswrapper[4937]: I0225 16:57:03.569434 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_48d22af0-5579-46fb-889d-fd34e46d26e9/object-updater/0.log" Feb 25 16:57:03 crc kubenswrapper[4937]: I0225 16:57:03.846586 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_994bcbfb-8270-42b1-bc77-6a262f2d29e3/tempest-tests-tempest-tests-runner/0.log" Feb 25 16:57:03 crc kubenswrapper[4937]: I0225 16:57:03.865079 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-g87qr_a6ef0688-25f8-4018-8976-30334bf11136/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:57:04 crc kubenswrapper[4937]: I0225 16:57:04.120279 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-bdx5t_cceb45e3-0685-45fb-b7c3-cf18ccb0649b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 16:57:04 crc kubenswrapper[4937]: I0225 16:57:04.182143 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_f62befdf-83b6-4767-8de5-d552bb54e3f9/test-operator-logs-container/0.log" Feb 25 16:57:07 crc kubenswrapper[4937]: I0225 16:57:07.230633 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_2ddd71fd-4c47-4357-87e7-16a2010a23df/memcached/0.log" Feb 25 16:57:08 crc kubenswrapper[4937]: I0225 16:57:08.890418 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xjq9m"] Feb 25 16:57:08 crc kubenswrapper[4937]: E0225 16:57:08.890863 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e14c844-f118-41c9-bd51-a63b1e5898c5" containerName="container-00" Feb 25 16:57:08 crc kubenswrapper[4937]: I0225 16:57:08.890880 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e14c844-f118-41c9-bd51-a63b1e5898c5" containerName="container-00" Feb 25 16:57:08 crc kubenswrapper[4937]: E0225 16:57:08.890899 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e723a3-7bd5-483a-8678-02a8df3a405b" containerName="oc" Feb 25 16:57:08 crc kubenswrapper[4937]: I0225 16:57:08.890908 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e723a3-7bd5-483a-8678-02a8df3a405b" containerName="oc" Feb 25 16:57:08 crc kubenswrapper[4937]: I0225 16:57:08.891162 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e723a3-7bd5-483a-8678-02a8df3a405b" containerName="oc" Feb 25 16:57:08 crc kubenswrapper[4937]: I0225 16:57:08.891179 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e14c844-f118-41c9-bd51-a63b1e5898c5" containerName="container-00" Feb 25 16:57:08 crc kubenswrapper[4937]: I0225 16:57:08.892684 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:08 crc kubenswrapper[4937]: I0225 16:57:08.905567 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xjq9m"] Feb 25 16:57:09 crc kubenswrapper[4937]: I0225 16:57:09.016547 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4c683a-31bd-4298-a592-0134e07d6065-utilities\") pod \"community-operators-xjq9m\" (UID: \"6c4c683a-31bd-4298-a592-0134e07d6065\") " pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:09 crc kubenswrapper[4937]: I0225 16:57:09.016609 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7twqn\" (UniqueName: \"kubernetes.io/projected/6c4c683a-31bd-4298-a592-0134e07d6065-kube-api-access-7twqn\") pod \"community-operators-xjq9m\" (UID: \"6c4c683a-31bd-4298-a592-0134e07d6065\") " pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:09 crc kubenswrapper[4937]: I0225 16:57:09.016777 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4c683a-31bd-4298-a592-0134e07d6065-catalog-content\") pod \"community-operators-xjq9m\" (UID: \"6c4c683a-31bd-4298-a592-0134e07d6065\") " pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:09 crc kubenswrapper[4937]: I0225 16:57:09.118583 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4c683a-31bd-4298-a592-0134e07d6065-utilities\") pod \"community-operators-xjq9m\" (UID: \"6c4c683a-31bd-4298-a592-0134e07d6065\") " pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:09 crc kubenswrapper[4937]: I0225 16:57:09.118638 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7twqn\" (UniqueName: \"kubernetes.io/projected/6c4c683a-31bd-4298-a592-0134e07d6065-kube-api-access-7twqn\") pod \"community-operators-xjq9m\" (UID: \"6c4c683a-31bd-4298-a592-0134e07d6065\") " pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:09 crc kubenswrapper[4937]: I0225 16:57:09.118688 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4c683a-31bd-4298-a592-0134e07d6065-catalog-content\") pod \"community-operators-xjq9m\" (UID: \"6c4c683a-31bd-4298-a592-0134e07d6065\") " pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:09 crc kubenswrapper[4937]: I0225 16:57:09.119240 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4c683a-31bd-4298-a592-0134e07d6065-catalog-content\") pod \"community-operators-xjq9m\" (UID: \"6c4c683a-31bd-4298-a592-0134e07d6065\") " pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:09 crc kubenswrapper[4937]: I0225 16:57:09.119457 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4c683a-31bd-4298-a592-0134e07d6065-utilities\") pod \"community-operators-xjq9m\" (UID: \"6c4c683a-31bd-4298-a592-0134e07d6065\") " pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:09 crc kubenswrapper[4937]: I0225 16:57:09.148182 4937 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7twqn\" (UniqueName: \"kubernetes.io/projected/6c4c683a-31bd-4298-a592-0134e07d6065-kube-api-access-7twqn\") pod \"community-operators-xjq9m\" (UID: \"6c4c683a-31bd-4298-a592-0134e07d6065\") " pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:09 crc kubenswrapper[4937]: I0225 16:57:09.209717 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:10 crc kubenswrapper[4937]: I0225 16:57:10.018072 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xjq9m"] Feb 25 16:57:10 crc kubenswrapper[4937]: I0225 16:57:10.934307 4937 generic.go:334] "Generic (PLEG): container finished" podID="6c4c683a-31bd-4298-a592-0134e07d6065" containerID="e1a025c5681c2916ac8c0b32fe4d6aea4150efdaeb600df17e0d48873846f537" exitCode=0 Feb 25 16:57:10 crc kubenswrapper[4937]: I0225 16:57:10.934366 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjq9m" event={"ID":"6c4c683a-31bd-4298-a592-0134e07d6065","Type":"ContainerDied","Data":"e1a025c5681c2916ac8c0b32fe4d6aea4150efdaeb600df17e0d48873846f537"} Feb 25 16:57:10 crc kubenswrapper[4937]: I0225 16:57:10.934442 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjq9m" event={"ID":"6c4c683a-31bd-4298-a592-0134e07d6065","Type":"ContainerStarted","Data":"2cad4153b523db3617be7165e05f78d95d9fa146e1eda0ec1bbb00270e9b76a7"} Feb 25 16:57:11 crc kubenswrapper[4937]: I0225 16:57:11.494363 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:57:11 crc kubenswrapper[4937]: I0225 16:57:11.494781 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:57:11 crc kubenswrapper[4937]: I0225 16:57:11.494841 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 16:57:11 crc kubenswrapper[4937]: I0225 16:57:11.495557 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0ce94622471d9659329394fa1b2af5fd3461490cd063464e738059704ab88ac2"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 16:57:11 crc kubenswrapper[4937]: I0225 16:57:11.495604 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://0ce94622471d9659329394fa1b2af5fd3461490cd063464e738059704ab88ac2" gracePeriod=600 Feb 25 16:57:11 crc kubenswrapper[4937]: I0225 16:57:11.944807 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjq9m" 
event={"ID":"6c4c683a-31bd-4298-a592-0134e07d6065","Type":"ContainerStarted","Data":"c091d86224abe49d300bd072f29f2f9f71ace00d6610a81c9ce4402733e06c74"} Feb 25 16:57:11 crc kubenswrapper[4937]: I0225 16:57:11.947236 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="0ce94622471d9659329394fa1b2af5fd3461490cd063464e738059704ab88ac2" exitCode=0 Feb 25 16:57:11 crc kubenswrapper[4937]: I0225 16:57:11.947277 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"0ce94622471d9659329394fa1b2af5fd3461490cd063464e738059704ab88ac2"} Feb 25 16:57:11 crc kubenswrapper[4937]: I0225 16:57:11.947301 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30"} Feb 25 16:57:11 crc kubenswrapper[4937]: I0225 16:57:11.947316 4937 scope.go:117] "RemoveContainer" containerID="a311f3ebb47e3aa75d7950f168b33532369c0854bd3165f8c418975704f354fd" Feb 25 16:57:13 crc kubenswrapper[4937]: I0225 16:57:13.968200 4937 generic.go:334] "Generic (PLEG): container finished" podID="6c4c683a-31bd-4298-a592-0134e07d6065" containerID="c091d86224abe49d300bd072f29f2f9f71ace00d6610a81c9ce4402733e06c74" exitCode=0 Feb 25 16:57:13 crc kubenswrapper[4937]: I0225 16:57:13.968324 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjq9m" event={"ID":"6c4c683a-31bd-4298-a592-0134e07d6065","Type":"ContainerDied","Data":"c091d86224abe49d300bd072f29f2f9f71ace00d6610a81c9ce4402733e06c74"} Feb 25 16:57:14 crc kubenswrapper[4937]: I0225 16:57:14.980089 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjq9m" event={"ID":"6c4c683a-31bd-4298-a592-0134e07d6065","Type":"ContainerStarted","Data":"205f0b9dfbaf9f5267f8951c1ccf54d1e98330928cfa3b7ee774269e63e9d9ba"} Feb 25 16:57:15 crc kubenswrapper[4937]: I0225 16:57:15.006313 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xjq9m" podStartSLOduration=3.498355429 podStartE2EDuration="7.006293728s" podCreationTimestamp="2026-02-25 16:57:08 +0000 UTC" firstStartedPulling="2026-02-25 16:57:10.936366967 +0000 UTC m=+4281.949758857" lastFinishedPulling="2026-02-25 16:57:14.444305266 +0000 UTC m=+4285.457697156" observedRunningTime="2026-02-25 16:57:14.999931748 +0000 UTC m=+4286.013323638" watchObservedRunningTime="2026-02-25 16:57:15.006293728 +0000 UTC m=+4286.019685618" Feb 25 16:57:19 crc kubenswrapper[4937]: I0225 16:57:19.209989 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:19 crc kubenswrapper[4937]: I0225 16:57:19.210420 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:19 crc kubenswrapper[4937]: I0225 16:57:19.295086 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:20 crc kubenswrapper[4937]: I0225 16:57:20.088679 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:20 crc kubenswrapper[4937]: I0225 16:57:20.144627 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xjq9m"] Feb 25 16:57:22 crc kubenswrapper[4937]: I0225 16:57:22.059186 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xjq9m" podUID="6c4c683a-31bd-4298-a592-0134e07d6065" containerName="registry-server" containerID="cri-o://205f0b9dfbaf9f5267f8951c1ccf54d1e98330928cfa3b7ee774269e63e9d9ba" gracePeriod=2 Feb 25 16:57:22 crc kubenswrapper[4937]: I0225 16:57:22.834015 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:22 crc kubenswrapper[4937]: I0225 16:57:22.906896 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7twqn\" (UniqueName: \"kubernetes.io/projected/6c4c683a-31bd-4298-a592-0134e07d6065-kube-api-access-7twqn\") pod \"6c4c683a-31bd-4298-a592-0134e07d6065\" (UID: \"6c4c683a-31bd-4298-a592-0134e07d6065\") " Feb 25 16:57:22 crc kubenswrapper[4937]: I0225 16:57:22.907064 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4c683a-31bd-4298-a592-0134e07d6065-catalog-content\") pod \"6c4c683a-31bd-4298-a592-0134e07d6065\" (UID: \"6c4c683a-31bd-4298-a592-0134e07d6065\") " Feb 25 16:57:22 crc kubenswrapper[4937]: I0225 16:57:22.907106 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4c683a-31bd-4298-a592-0134e07d6065-utilities\") pod \"6c4c683a-31bd-4298-a592-0134e07d6065\" (UID: \"6c4c683a-31bd-4298-a592-0134e07d6065\") " Feb 25 16:57:22 crc kubenswrapper[4937]: I0225 16:57:22.908703 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c4c683a-31bd-4298-a592-0134e07d6065-utilities" (OuterVolumeSpecName: "utilities") pod "6c4c683a-31bd-4298-a592-0134e07d6065" (UID: "6c4c683a-31bd-4298-a592-0134e07d6065"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:57:22 crc kubenswrapper[4937]: I0225 16:57:22.913651 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c4c683a-31bd-4298-a592-0134e07d6065-kube-api-access-7twqn" (OuterVolumeSpecName: "kube-api-access-7twqn") pod "6c4c683a-31bd-4298-a592-0134e07d6065" (UID: "6c4c683a-31bd-4298-a592-0134e07d6065"). InnerVolumeSpecName "kube-api-access-7twqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:57:22 crc kubenswrapper[4937]: I0225 16:57:22.979639 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c4c683a-31bd-4298-a592-0134e07d6065-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c4c683a-31bd-4298-a592-0134e07d6065" (UID: "6c4c683a-31bd-4298-a592-0134e07d6065"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.010293 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4c683a-31bd-4298-a592-0134e07d6065-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.010352 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4c683a-31bd-4298-a592-0134e07d6065-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.010374 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7twqn\" (UniqueName: \"kubernetes.io/projected/6c4c683a-31bd-4298-a592-0134e07d6065-kube-api-access-7twqn\") on node \"crc\" DevicePath \"\"" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.069672 4937 generic.go:334] "Generic (PLEG): container finished" podID="6c4c683a-31bd-4298-a592-0134e07d6065" containerID="205f0b9dfbaf9f5267f8951c1ccf54d1e98330928cfa3b7ee774269e63e9d9ba" exitCode=0 Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.069728 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjq9m" event={"ID":"6c4c683a-31bd-4298-a592-0134e07d6065","Type":"ContainerDied","Data":"205f0b9dfbaf9f5267f8951c1ccf54d1e98330928cfa3b7ee774269e63e9d9ba"} Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.069762 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjq9m" event={"ID":"6c4c683a-31bd-4298-a592-0134e07d6065","Type":"ContainerDied","Data":"2cad4153b523db3617be7165e05f78d95d9fa146e1eda0ec1bbb00270e9b76a7"} Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.069780 4937 scope.go:117] "RemoveContainer" containerID="205f0b9dfbaf9f5267f8951c1ccf54d1e98330928cfa3b7ee774269e63e9d9ba" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.070970 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xjq9m" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.091691 4937 scope.go:117] "RemoveContainer" containerID="c091d86224abe49d300bd072f29f2f9f71ace00d6610a81c9ce4402733e06c74" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.112224 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xjq9m"] Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.122935 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xjq9m"] Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.126702 4937 scope.go:117] "RemoveContainer" containerID="e1a025c5681c2916ac8c0b32fe4d6aea4150efdaeb600df17e0d48873846f537" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.169286 4937 scope.go:117] "RemoveContainer" containerID="205f0b9dfbaf9f5267f8951c1ccf54d1e98330928cfa3b7ee774269e63e9d9ba" Feb 25 16:57:23 crc kubenswrapper[4937]: E0225 16:57:23.169864 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"205f0b9dfbaf9f5267f8951c1ccf54d1e98330928cfa3b7ee774269e63e9d9ba\": container with ID starting with 205f0b9dfbaf9f5267f8951c1ccf54d1e98330928cfa3b7ee774269e63e9d9ba not found: ID does not exist" containerID="205f0b9dfbaf9f5267f8951c1ccf54d1e98330928cfa3b7ee774269e63e9d9ba" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.169904 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"205f0b9dfbaf9f5267f8951c1ccf54d1e98330928cfa3b7ee774269e63e9d9ba"} err="failed to get container status \"205f0b9dfbaf9f5267f8951c1ccf54d1e98330928cfa3b7ee774269e63e9d9ba\": rpc error: code = NotFound desc = could not find container \"205f0b9dfbaf9f5267f8951c1ccf54d1e98330928cfa3b7ee774269e63e9d9ba\": container with ID starting with 205f0b9dfbaf9f5267f8951c1ccf54d1e98330928cfa3b7ee774269e63e9d9ba not found: ID does not exist" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.169930 4937 scope.go:117] "RemoveContainer" containerID="c091d86224abe49d300bd072f29f2f9f71ace00d6610a81c9ce4402733e06c74" Feb 25 16:57:23 crc kubenswrapper[4937]: E0225 16:57:23.170314 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c091d86224abe49d300bd072f29f2f9f71ace00d6610a81c9ce4402733e06c74\": container with ID starting with c091d86224abe49d300bd072f29f2f9f71ace00d6610a81c9ce4402733e06c74 not found: ID does not exist" containerID="c091d86224abe49d300bd072f29f2f9f71ace00d6610a81c9ce4402733e06c74" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.170393 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c091d86224abe49d300bd072f29f2f9f71ace00d6610a81c9ce4402733e06c74"} err="failed to get container status \"c091d86224abe49d300bd072f29f2f9f71ace00d6610a81c9ce4402733e06c74\": rpc error: code = NotFound desc = could not find container \"c091d86224abe49d300bd072f29f2f9f71ace00d6610a81c9ce4402733e06c74\": container with ID starting with c091d86224abe49d300bd072f29f2f9f71ace00d6610a81c9ce4402733e06c74 not found: ID does not exist" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.170463 4937 scope.go:117] "RemoveContainer" containerID="e1a025c5681c2916ac8c0b32fe4d6aea4150efdaeb600df17e0d48873846f537" Feb 25 16:57:23 crc kubenswrapper[4937]: E0225 16:57:23.170708 4937 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e1a025c5681c2916ac8c0b32fe4d6aea4150efdaeb600df17e0d48873846f537\": container with ID starting with e1a025c5681c2916ac8c0b32fe4d6aea4150efdaeb600df17e0d48873846f537 not found: ID does not exist" containerID="e1a025c5681c2916ac8c0b32fe4d6aea4150efdaeb600df17e0d48873846f537" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.170781 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a025c5681c2916ac8c0b32fe4d6aea4150efdaeb600df17e0d48873846f537"} err="failed to get container status \"e1a025c5681c2916ac8c0b32fe4d6aea4150efdaeb600df17e0d48873846f537\": rpc error: code = NotFound desc = could not find container \"e1a025c5681c2916ac8c0b32fe4d6aea4150efdaeb600df17e0d48873846f537\": container with ID starting with e1a025c5681c2916ac8c0b32fe4d6aea4150efdaeb600df17e0d48873846f537 not found: ID does not exist" Feb 25 16:57:23 crc kubenswrapper[4937]: I0225 16:57:23.385973 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c4c683a-31bd-4298-a592-0134e07d6065" path="/var/lib/kubelet/pods/6c4c683a-31bd-4298-a592-0134e07d6065/volumes" Feb 25 16:57:37 crc kubenswrapper[4937]: I0225 16:57:37.716542 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/util/0.log" Feb 25 16:57:38 crc kubenswrapper[4937]: I0225 16:57:38.016827 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/pull/0.log" Feb 25 16:57:38 crc kubenswrapper[4937]: I0225 16:57:38.036550 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/pull/0.log" Feb 25 16:57:38 crc kubenswrapper[4937]: I0225 16:57:38.062051 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/util/0.log" Feb 25 16:57:38 crc kubenswrapper[4937]: I0225 16:57:38.217476 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/pull/0.log" Feb 25 16:57:38 crc kubenswrapper[4937]: I0225 16:57:38.229320 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/util/0.log" Feb 25 16:57:38 crc kubenswrapper[4937]: I0225 16:57:38.251673 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ba429d8584cdfbf202c9283399a2d787f6a1ad5c29cb5c8ee21ee0af38rsm7_7671f573-d466-4764-9094-4cc7250e6d3d/extract/0.log" Feb 25 16:57:38 crc kubenswrapper[4937]: I0225 16:57:38.842924 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-95dsz_5c7c6408-d0c4-42ea-ae7b-e10b49e13355/manager/0.log" Feb 25 16:57:39 crc kubenswrapper[4937]: I0225 16:57:39.200880 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-q92r7_079024f7-46b2-46fa-b96b-e4dca470cb4b/manager/0.log" Feb 25 16:57:39 crc 
kubenswrapper[4937]: I0225 16:57:39.392588 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-k7z4s_8132d735-0341-43be-93de-730c15511083/manager/0.log" Feb 25 16:57:39 crc kubenswrapper[4937]: I0225 16:57:39.602193 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-lnw2m_7b01abed-0e59-495b-8b5e-2229c8d3215f/manager/0.log" Feb 25 16:57:40 crc kubenswrapper[4937]: I0225 16:57:40.443604 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-vjpzc_88ef567f-e68d-47aa-9788-4307003a77a0/manager/0.log" Feb 25 16:57:40 crc kubenswrapper[4937]: I0225 16:57:40.623710 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-xhxm2_7f4f0820-dd56-4d0b-aa5e-70dcab23e568/manager/0.log" Feb 25 16:57:40 crc kubenswrapper[4937]: I0225 16:57:40.751203 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-lkw74_806dde6d-ac75-47d7-98e2-0ba5959614a3/manager/0.log" Feb 25 16:57:40 crc kubenswrapper[4937]: I0225 16:57:40.941401 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-sw29j_18df78fd-5382-4716-9708-4e669508c898/manager/0.log" Feb 25 16:57:41 crc kubenswrapper[4937]: I0225 16:57:41.819082 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-ntf28_b688ff11-a838-4c26-90bd-974c871f4d44/manager/0.log" Feb 25 16:57:41 crc kubenswrapper[4937]: I0225 16:57:41.835041 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-g82nw_6cb3892f-a950-4dc7-9b9b-0db2876c569d/manager/0.log" Feb 25 16:57:41 crc kubenswrapper[4937]: I0225 16:57:41.939573 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-bz4hc_23514cd7-1535-4c0a-a090-68c39654dad2/manager/0.log" Feb 25 16:57:42 crc kubenswrapper[4937]: I0225 16:57:42.209844 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-n64lk_42c84a2f-b585-49c5-adb6-fb83ffecef77/manager/0.log" Feb 25 16:57:42 crc kubenswrapper[4937]: I0225 16:57:42.222970 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-vh5zk_4e98f637-2524-43db-9b27-4bd68ae19bf4/manager/0.log" Feb 25 16:57:42 crc kubenswrapper[4937]: I0225 16:57:42.385152 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cj4zq9_00b4788a-4566-469f-8731-51700725fea0/manager/0.log" Feb 25 16:57:42 crc kubenswrapper[4937]: I0225 16:57:42.786914 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5f5c559654-md6zv_380b8472-bb7f-421e-8a0a-7da8078b6ecc/operator/0.log" Feb 25 16:57:43 crc kubenswrapper[4937]: I0225 16:57:43.029099 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7hxvd_65bfe4a4-8d7d-48bc-823a-5b388022052f/registry-server/0.log" Feb 25 16:57:43 crc kubenswrapper[4937]: I0225 
16:57:43.236790 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-vt5mw_e6bcab89-8beb-4879-8596-3a24805bd835/manager/0.log" Feb 25 16:57:43 crc kubenswrapper[4937]: I0225 16:57:43.402886 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-z2bqs_b8448aa3-7cd0-4732-ad80-99fbefc125a6/manager/0.log" Feb 25 16:57:43 crc kubenswrapper[4937]: I0225 16:57:43.452991 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-hmm4d_c21d7933-3e35-48d9-8946-5ffdcc7a42bf/operator/0.log" Feb 25 16:57:44 crc kubenswrapper[4937]: I0225 16:57:44.041037 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-s8x4b_2dde13f7-ba29-4c24-94e0-052d622fe88c/manager/0.log" Feb 25 16:57:44 crc kubenswrapper[4937]: I0225 16:57:44.640399 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-mtqtq_b8a9d073-1b33-4184-8727-28c957c96e5f/manager/0.log" Feb 25 16:57:44 crc kubenswrapper[4937]: I0225 16:57:44.806554 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-fbcb9db89-8spmv_2007fabb-e6dd-4713-823d-f6a8a3cd41f1/manager/0.log" Feb 25 16:57:45 crc kubenswrapper[4937]: I0225 16:57:45.033863 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-pkrhd_5d73c9f2-ead1-410a-ad35-16b7ba251daa/manager/0.log" Feb 25 16:57:45 crc kubenswrapper[4937]: I0225 16:57:45.051724 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-78747bd5c7-dtngf_eefbad00-59b6-4e7c-b056-ba07663a665f/manager/0.log" Feb 25 16:57:48 crc kubenswrapper[4937]: I0225 16:57:48.991190 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-72hwb_1e2c3857-1279-466f-8da3-ea1f5cf13893/manager/0.log" Feb 25 16:57:56 crc kubenswrapper[4937]: I0225 16:57:56.943254 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6ssq5"] Feb 25 16:57:56 crc kubenswrapper[4937]: E0225 16:57:56.944236 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4c683a-31bd-4298-a592-0134e07d6065" containerName="registry-server" Feb 25 16:57:56 crc kubenswrapper[4937]: I0225 16:57:56.944253 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4c683a-31bd-4298-a592-0134e07d6065" containerName="registry-server" Feb 25 16:57:56 crc kubenswrapper[4937]: E0225 16:57:56.944277 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4c683a-31bd-4298-a592-0134e07d6065" containerName="extract-utilities" Feb 25 16:57:56 crc kubenswrapper[4937]: I0225 16:57:56.944285 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4c683a-31bd-4298-a592-0134e07d6065" containerName="extract-utilities" Feb 25 16:57:56 crc kubenswrapper[4937]: E0225 16:57:56.944334 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4c683a-31bd-4298-a592-0134e07d6065" containerName="extract-content" Feb 25 16:57:56 crc kubenswrapper[4937]: I0225 16:57:56.944342 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4c683a-31bd-4298-a592-0134e07d6065" 
containerName="extract-content" Feb 25 16:57:56 crc kubenswrapper[4937]: I0225 16:57:56.944625 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c4c683a-31bd-4298-a592-0134e07d6065" containerName="registry-server" Feb 25 16:57:56 crc kubenswrapper[4937]: I0225 16:57:56.946829 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:57:56 crc kubenswrapper[4937]: I0225 16:57:56.969362 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6ssq5"] Feb 25 16:57:57 crc kubenswrapper[4937]: I0225 16:57:57.055088 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1abf696b-d380-4124-a928-223c19fed0ce-catalog-content\") pod \"certified-operators-6ssq5\" (UID: \"1abf696b-d380-4124-a928-223c19fed0ce\") " pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:57:57 crc kubenswrapper[4937]: I0225 16:57:57.055246 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gmkw\" (UniqueName: \"kubernetes.io/projected/1abf696b-d380-4124-a928-223c19fed0ce-kube-api-access-8gmkw\") pod \"certified-operators-6ssq5\" (UID: \"1abf696b-d380-4124-a928-223c19fed0ce\") " pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:57:57 crc kubenswrapper[4937]: I0225 16:57:57.055666 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1abf696b-d380-4124-a928-223c19fed0ce-utilities\") pod \"certified-operators-6ssq5\" (UID: \"1abf696b-d380-4124-a928-223c19fed0ce\") " pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:57:57 crc kubenswrapper[4937]: I0225 16:57:57.157972 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gmkw\" (UniqueName: \"kubernetes.io/projected/1abf696b-d380-4124-a928-223c19fed0ce-kube-api-access-8gmkw\") pod \"certified-operators-6ssq5\" (UID: \"1abf696b-d380-4124-a928-223c19fed0ce\") " pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:57:57 crc kubenswrapper[4937]: I0225 16:57:57.158146 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1abf696b-d380-4124-a928-223c19fed0ce-utilities\") pod \"certified-operators-6ssq5\" (UID: \"1abf696b-d380-4124-a928-223c19fed0ce\") " pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:57:57 crc kubenswrapper[4937]: I0225 16:57:57.158212 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1abf696b-d380-4124-a928-223c19fed0ce-catalog-content\") pod \"certified-operators-6ssq5\" (UID: \"1abf696b-d380-4124-a928-223c19fed0ce\") " pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:57:57 crc kubenswrapper[4937]: I0225 16:57:57.158739 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1abf696b-d380-4124-a928-223c19fed0ce-utilities\") pod \"certified-operators-6ssq5\" (UID: \"1abf696b-d380-4124-a928-223c19fed0ce\") " pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:57:57 crc kubenswrapper[4937]: I0225 16:57:57.158974 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1abf696b-d380-4124-a928-223c19fed0ce-catalog-content\") pod \"certified-operators-6ssq5\" (UID: \"1abf696b-d380-4124-a928-223c19fed0ce\") " pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:57:57 crc kubenswrapper[4937]: I0225 16:57:57.180156 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gmkw\" (UniqueName: \"kubernetes.io/projected/1abf696b-d380-4124-a928-223c19fed0ce-kube-api-access-8gmkw\") pod \"certified-operators-6ssq5\" (UID: \"1abf696b-d380-4124-a928-223c19fed0ce\") " pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:57:57 crc kubenswrapper[4937]: I0225 16:57:57.265339 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:57:57 crc kubenswrapper[4937]: I0225 16:57:57.830064 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6ssq5"] Feb 25 16:57:58 crc kubenswrapper[4937]: I0225 16:57:58.440240 4937 generic.go:334] "Generic (PLEG): container finished" podID="1abf696b-d380-4124-a928-223c19fed0ce" containerID="4744cb9b7c86a97b681bf5f60c9eb45bfdf257eb385e0b7464616a6bb0acae54" exitCode=0 Feb 25 16:57:58 crc kubenswrapper[4937]: I0225 16:57:58.440283 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ssq5" event={"ID":"1abf696b-d380-4124-a928-223c19fed0ce","Type":"ContainerDied","Data":"4744cb9b7c86a97b681bf5f60c9eb45bfdf257eb385e0b7464616a6bb0acae54"} Feb 25 16:57:58 crc kubenswrapper[4937]: I0225 16:57:58.440544 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ssq5" event={"ID":"1abf696b-d380-4124-a928-223c19fed0ce","Type":"ContainerStarted","Data":"a5eab19ec227420e29f375694db3e99ba3853c736f5f18d0357c6ac184a56cd2"} Feb 25 16:57:59 crc kubenswrapper[4937]: I0225 16:57:59.455063 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ssq5" event={"ID":"1abf696b-d380-4124-a928-223c19fed0ce","Type":"ContainerStarted","Data":"a2617fa6338f2de47dd7b314fb679c7f5252c9776067c9fd78e6a085f5be62a4"} Feb 25 16:58:00 crc kubenswrapper[4937]: I0225 16:58:00.145161 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533978-7vkz5"] Feb 25 16:58:00 crc kubenswrapper[4937]: I0225 16:58:00.147289 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533978-7vkz5" Feb 25 16:58:00 crc kubenswrapper[4937]: I0225 16:58:00.149682 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 16:58:00 crc kubenswrapper[4937]: I0225 16:58:00.149781 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 16:58:00 crc kubenswrapper[4937]: I0225 16:58:00.150024 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 16:58:00 crc kubenswrapper[4937]: I0225 16:58:00.156286 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533978-7vkz5"] Feb 25 16:58:00 crc kubenswrapper[4937]: I0225 16:58:00.222336 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j577n\" (UniqueName: \"kubernetes.io/projected/37aa378c-2e15-4213-9554-d54aab5803da-kube-api-access-j577n\") pod \"auto-csr-approver-29533978-7vkz5\" (UID: \"37aa378c-2e15-4213-9554-d54aab5803da\") " pod="openshift-infra/auto-csr-approver-29533978-7vkz5" Feb 25 16:58:00 crc kubenswrapper[4937]: I0225 16:58:00.323647 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j577n\" (UniqueName: \"kubernetes.io/projected/37aa378c-2e15-4213-9554-d54aab5803da-kube-api-access-j577n\") pod \"auto-csr-approver-29533978-7vkz5\" (UID: \"37aa378c-2e15-4213-9554-d54aab5803da\") " pod="openshift-infra/auto-csr-approver-29533978-7vkz5" Feb 25 16:58:00 crc kubenswrapper[4937]: I0225 16:58:00.346424 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j577n\" (UniqueName: \"kubernetes.io/projected/37aa378c-2e15-4213-9554-d54aab5803da-kube-api-access-j577n\") pod \"auto-csr-approver-29533978-7vkz5\" (UID: \"37aa378c-2e15-4213-9554-d54aab5803da\") " pod="openshift-infra/auto-csr-approver-29533978-7vkz5" Feb 25 16:58:00 crc kubenswrapper[4937]: I0225 16:58:00.464348 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533978-7vkz5" Feb 25 16:58:00 crc kubenswrapper[4937]: I0225 16:58:00.960389 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533978-7vkz5"] Feb 25 16:58:01 crc kubenswrapper[4937]: I0225 16:58:01.475998 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533978-7vkz5" event={"ID":"37aa378c-2e15-4213-9554-d54aab5803da","Type":"ContainerStarted","Data":"d862362f2e13b8050583e779ac9b9c85b49f8b3b1e9280626a320b11be69a38e"} Feb 25 16:58:05 crc kubenswrapper[4937]: I0225 16:58:05.529544 4937 generic.go:334] "Generic (PLEG): container finished" podID="1abf696b-d380-4124-a928-223c19fed0ce" containerID="a2617fa6338f2de47dd7b314fb679c7f5252c9776067c9fd78e6a085f5be62a4" exitCode=0 Feb 25 16:58:05 crc kubenswrapper[4937]: I0225 16:58:05.529621 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ssq5" event={"ID":"1abf696b-d380-4124-a928-223c19fed0ce","Type":"ContainerDied","Data":"a2617fa6338f2de47dd7b314fb679c7f5252c9776067c9fd78e6a085f5be62a4"} Feb 25 16:58:05 crc kubenswrapper[4937]: I0225 16:58:05.532258 4937 generic.go:334] "Generic (PLEG): container finished" podID="37aa378c-2e15-4213-9554-d54aab5803da" containerID="5e8b62139f79705c5364ccedc10bddd3925ed9baab4662f7ce5641fd2a2be0fe" exitCode=0 Feb 25 16:58:05 crc kubenswrapper[4937]: I0225 16:58:05.532334 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533978-7vkz5" event={"ID":"37aa378c-2e15-4213-9554-d54aab5803da","Type":"ContainerDied","Data":"5e8b62139f79705c5364ccedc10bddd3925ed9baab4662f7ce5641fd2a2be0fe"} Feb 25 16:58:07 crc kubenswrapper[4937]: I0225 16:58:07.219722 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533978-7vkz5" Feb 25 16:58:07 crc kubenswrapper[4937]: I0225 16:58:07.264649 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j577n\" (UniqueName: \"kubernetes.io/projected/37aa378c-2e15-4213-9554-d54aab5803da-kube-api-access-j577n\") pod \"37aa378c-2e15-4213-9554-d54aab5803da\" (UID: \"37aa378c-2e15-4213-9554-d54aab5803da\") " Feb 25 16:58:07 crc kubenswrapper[4937]: I0225 16:58:07.280867 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37aa378c-2e15-4213-9554-d54aab5803da-kube-api-access-j577n" (OuterVolumeSpecName: "kube-api-access-j577n") pod "37aa378c-2e15-4213-9554-d54aab5803da" (UID: "37aa378c-2e15-4213-9554-d54aab5803da"). InnerVolumeSpecName "kube-api-access-j577n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:58:07 crc kubenswrapper[4937]: I0225 16:58:07.366574 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j577n\" (UniqueName: \"kubernetes.io/projected/37aa378c-2e15-4213-9554-d54aab5803da-kube-api-access-j577n\") on node \"crc\" DevicePath \"\"" Feb 25 16:58:07 crc kubenswrapper[4937]: I0225 16:58:07.564947 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ssq5" event={"ID":"1abf696b-d380-4124-a928-223c19fed0ce","Type":"ContainerStarted","Data":"cb0b74b5bd65161f1eafcf9356f01bc7c798457cf53d6418ed9c6cd594468b71"} Feb 25 16:58:07 crc kubenswrapper[4937]: I0225 16:58:07.568357 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533978-7vkz5" event={"ID":"37aa378c-2e15-4213-9554-d54aab5803da","Type":"ContainerDied","Data":"d862362f2e13b8050583e779ac9b9c85b49f8b3b1e9280626a320b11be69a38e"} Feb 25 16:58:07 crc kubenswrapper[4937]: I0225 16:58:07.568408 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d862362f2e13b8050583e779ac9b9c85b49f8b3b1e9280626a320b11be69a38e" Feb 25 16:58:07 crc kubenswrapper[4937]: I0225 16:58:07.568475 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533978-7vkz5" Feb 25 16:58:07 crc kubenswrapper[4937]: I0225 16:58:07.591753 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6ssq5" podStartSLOduration=3.6832222249999997 podStartE2EDuration="11.591735032s" podCreationTimestamp="2026-02-25 16:57:56 +0000 UTC" firstStartedPulling="2026-02-25 16:57:58.442204655 +0000 UTC m=+4329.455596545" lastFinishedPulling="2026-02-25 16:58:06.350717452 +0000 UTC m=+4337.364109352" observedRunningTime="2026-02-25 16:58:07.582200614 +0000 UTC m=+4338.595592524" watchObservedRunningTime="2026-02-25 16:58:07.591735032 +0000 UTC m=+4338.605126922" Feb 25 16:58:08 crc kubenswrapper[4937]: I0225 16:58:08.298755 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533972-8pwxc"] Feb 25 16:58:08 crc kubenswrapper[4937]: I0225 16:58:08.311285 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533972-8pwxc"] Feb 25 16:58:09 crc kubenswrapper[4937]: I0225 16:58:09.379378 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd05df20-2126-4248-bdb5-2e574b56e291" path="/var/lib/kubelet/pods/bd05df20-2126-4248-bdb5-2e574b56e291/volumes" Feb 25 16:58:12 crc kubenswrapper[4937]: I0225 16:58:12.582626 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dz785_9b1dc13b-9b02-42b0-a00e-21f15f9f98a2/control-plane-machine-set-operator/1.log" Feb 25 16:58:12 crc kubenswrapper[4937]: I0225 16:58:12.650347 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dz785_9b1dc13b-9b02-42b0-a00e-21f15f9f98a2/control-plane-machine-set-operator/0.log" Feb 25 16:58:12 crc kubenswrapper[4937]: I0225 16:58:12.829634 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8zn9j_7210df16-765e-4b49-8b67-8989f4b2f15c/kube-rbac-proxy/0.log" Feb 25 16:58:12 crc kubenswrapper[4937]: I0225 16:58:12.881520 4937 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8zn9j_7210df16-765e-4b49-8b67-8989f4b2f15c/machine-api-operator/0.log" Feb 25 16:58:17 crc kubenswrapper[4937]: I0225 16:58:17.265928 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:58:17 crc kubenswrapper[4937]: I0225 16:58:17.266601 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:58:17 crc kubenswrapper[4937]: I0225 16:58:17.317070 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:58:17 crc kubenswrapper[4937]: I0225 16:58:17.705840 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:58:17 crc kubenswrapper[4937]: I0225 16:58:17.762201 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6ssq5"] Feb 25 16:58:19 crc kubenswrapper[4937]: I0225 16:58:19.679147 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6ssq5" podUID="1abf696b-d380-4124-a928-223c19fed0ce" containerName="registry-server" containerID="cri-o://cb0b74b5bd65161f1eafcf9356f01bc7c798457cf53d6418ed9c6cd594468b71" gracePeriod=2 Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.430122 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.529137 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1abf696b-d380-4124-a928-223c19fed0ce-catalog-content\") pod \"1abf696b-d380-4124-a928-223c19fed0ce\" (UID: \"1abf696b-d380-4124-a928-223c19fed0ce\") " Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.529209 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1abf696b-d380-4124-a928-223c19fed0ce-utilities\") pod \"1abf696b-d380-4124-a928-223c19fed0ce\" (UID: \"1abf696b-d380-4124-a928-223c19fed0ce\") " Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.529326 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gmkw\" (UniqueName: \"kubernetes.io/projected/1abf696b-d380-4124-a928-223c19fed0ce-kube-api-access-8gmkw\") pod \"1abf696b-d380-4124-a928-223c19fed0ce\" (UID: \"1abf696b-d380-4124-a928-223c19fed0ce\") " Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.530070 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1abf696b-d380-4124-a928-223c19fed0ce-utilities" (OuterVolumeSpecName: "utilities") pod "1abf696b-d380-4124-a928-223c19fed0ce" (UID: "1abf696b-d380-4124-a928-223c19fed0ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.535430 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1abf696b-d380-4124-a928-223c19fed0ce-kube-api-access-8gmkw" (OuterVolumeSpecName: "kube-api-access-8gmkw") pod "1abf696b-d380-4124-a928-223c19fed0ce" (UID: "1abf696b-d380-4124-a928-223c19fed0ce"). 
InnerVolumeSpecName "kube-api-access-8gmkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.577833 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1abf696b-d380-4124-a928-223c19fed0ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1abf696b-d380-4124-a928-223c19fed0ce" (UID: "1abf696b-d380-4124-a928-223c19fed0ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.632170 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1abf696b-d380-4124-a928-223c19fed0ce-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.632216 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1abf696b-d380-4124-a928-223c19fed0ce-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.632230 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gmkw\" (UniqueName: \"kubernetes.io/projected/1abf696b-d380-4124-a928-223c19fed0ce-kube-api-access-8gmkw\") on node \"crc\" DevicePath \"\"" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.698332 4937 generic.go:334] "Generic (PLEG): container finished" podID="1abf696b-d380-4124-a928-223c19fed0ce" containerID="cb0b74b5bd65161f1eafcf9356f01bc7c798457cf53d6418ed9c6cd594468b71" exitCode=0 Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.698370 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ssq5" event={"ID":"1abf696b-d380-4124-a928-223c19fed0ce","Type":"ContainerDied","Data":"cb0b74b5bd65161f1eafcf9356f01bc7c798457cf53d6418ed9c6cd594468b71"} Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.698398 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ssq5" event={"ID":"1abf696b-d380-4124-a928-223c19fed0ce","Type":"ContainerDied","Data":"a5eab19ec227420e29f375694db3e99ba3853c736f5f18d0357c6ac184a56cd2"} Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.698416 4937 scope.go:117] "RemoveContainer" containerID="cb0b74b5bd65161f1eafcf9356f01bc7c798457cf53d6418ed9c6cd594468b71" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.698619 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6ssq5" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.725870 4937 scope.go:117] "RemoveContainer" containerID="a2617fa6338f2de47dd7b314fb679c7f5252c9776067c9fd78e6a085f5be62a4" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.742880 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6ssq5"] Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.753436 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6ssq5"] Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.754981 4937 scope.go:117] "RemoveContainer" containerID="4744cb9b7c86a97b681bf5f60c9eb45bfdf257eb385e0b7464616a6bb0acae54" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.816133 4937 scope.go:117] "RemoveContainer" containerID="cb0b74b5bd65161f1eafcf9356f01bc7c798457cf53d6418ed9c6cd594468b71" Feb 25 16:58:20 crc kubenswrapper[4937]: E0225 16:58:20.816798 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0b74b5bd65161f1eafcf9356f01bc7c798457cf53d6418ed9c6cd594468b71\": container with ID starting with cb0b74b5bd65161f1eafcf9356f01bc7c798457cf53d6418ed9c6cd594468b71 not found: ID does not exist" containerID="cb0b74b5bd65161f1eafcf9356f01bc7c798457cf53d6418ed9c6cd594468b71" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.816872 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0b74b5bd65161f1eafcf9356f01bc7c798457cf53d6418ed9c6cd594468b71"} err="failed to get container status \"cb0b74b5bd65161f1eafcf9356f01bc7c798457cf53d6418ed9c6cd594468b71\": rpc error: code = NotFound desc = could not find container \"cb0b74b5bd65161f1eafcf9356f01bc7c798457cf53d6418ed9c6cd594468b71\": container with ID starting with cb0b74b5bd65161f1eafcf9356f01bc7c798457cf53d6418ed9c6cd594468b71 not found: ID does not exist" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.816926 4937 scope.go:117] "RemoveContainer" containerID="a2617fa6338f2de47dd7b314fb679c7f5252c9776067c9fd78e6a085f5be62a4" Feb 25 16:58:20 crc kubenswrapper[4937]: E0225 16:58:20.817387 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2617fa6338f2de47dd7b314fb679c7f5252c9776067c9fd78e6a085f5be62a4\": container with ID starting with a2617fa6338f2de47dd7b314fb679c7f5252c9776067c9fd78e6a085f5be62a4 not found: ID does not exist" containerID="a2617fa6338f2de47dd7b314fb679c7f5252c9776067c9fd78e6a085f5be62a4" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.817430 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2617fa6338f2de47dd7b314fb679c7f5252c9776067c9fd78e6a085f5be62a4"} err="failed to get container status \"a2617fa6338f2de47dd7b314fb679c7f5252c9776067c9fd78e6a085f5be62a4\": rpc error: code = NotFound desc = could not find container \"a2617fa6338f2de47dd7b314fb679c7f5252c9776067c9fd78e6a085f5be62a4\": container with ID starting with a2617fa6338f2de47dd7b314fb679c7f5252c9776067c9fd78e6a085f5be62a4 not found: ID does not exist" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.817459 4937 scope.go:117] "RemoveContainer" containerID="4744cb9b7c86a97b681bf5f60c9eb45bfdf257eb385e0b7464616a6bb0acae54" Feb 25 16:58:20 crc kubenswrapper[4937]: E0225 16:58:20.817873 4937 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4744cb9b7c86a97b681bf5f60c9eb45bfdf257eb385e0b7464616a6bb0acae54\": container with ID starting with 4744cb9b7c86a97b681bf5f60c9eb45bfdf257eb385e0b7464616a6bb0acae54 not found: ID does not exist" containerID="4744cb9b7c86a97b681bf5f60c9eb45bfdf257eb385e0b7464616a6bb0acae54" Feb 25 16:58:20 crc kubenswrapper[4937]: I0225 16:58:20.817904 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4744cb9b7c86a97b681bf5f60c9eb45bfdf257eb385e0b7464616a6bb0acae54"} err="failed to get container status \"4744cb9b7c86a97b681bf5f60c9eb45bfdf257eb385e0b7464616a6bb0acae54\": rpc error: code = NotFound desc = could not find container \"4744cb9b7c86a97b681bf5f60c9eb45bfdf257eb385e0b7464616a6bb0acae54\": container with ID starting with 4744cb9b7c86a97b681bf5f60c9eb45bfdf257eb385e0b7464616a6bb0acae54 not found: ID does not exist" Feb 25 16:58:21 crc kubenswrapper[4937]: I0225 16:58:21.381102 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1abf696b-d380-4124-a928-223c19fed0ce" path="/var/lib/kubelet/pods/1abf696b-d380-4124-a928-223c19fed0ce/volumes" Feb 25 16:58:28 crc kubenswrapper[4937]: I0225 16:58:28.052176 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-tfn2h_92b2442a-04d9-4377-bef2-958d8a72543f/cert-manager-controller/0.log" Feb 25 16:58:28 crc kubenswrapper[4937]: I0225 16:58:28.176895 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-p4vff_99c5f86d-7755-49b0-bb68-7e9a338dbca7/cert-manager-cainjector/0.log" Feb 25 16:58:28 crc kubenswrapper[4937]: I0225 16:58:28.262467 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-m4kc4_2e84eec9-8ff5-4f02-9596-e468e289dba0/cert-manager-webhook/0.log" Feb 25 16:58:44 crc kubenswrapper[4937]: I0225 16:58:44.125534 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-xdc54_ac54500d-8e21-4b21-bb07-9ac1daf6ad08/nmstate-console-plugin/0.log" Feb 25 16:58:44 crc kubenswrapper[4937]: I0225 16:58:44.324792 4937 scope.go:117] "RemoveContainer" containerID="bb81505b111baf8c042ae7d60bcf01d4846b804b1ed8eef3682bd074e5f015ba" Feb 25 16:58:44 crc kubenswrapper[4937]: I0225 16:58:44.340876 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-hb5qm_3c5b69b1-26a3-4de2-9d56-ffc97c64ddad/nmstate-handler/0.log" Feb 25 16:58:44 crc kubenswrapper[4937]: I0225 16:58:44.448084 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-gs4df_6ce93581-d0da-4acc-978d-4c7b936d736b/kube-rbac-proxy/0.log" Feb 25 16:58:44 crc kubenswrapper[4937]: I0225 16:58:44.501800 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-gs4df_6ce93581-d0da-4acc-978d-4c7b936d736b/nmstate-metrics/0.log" Feb 25 16:58:44 crc kubenswrapper[4937]: I0225 16:58:44.735661 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-c4wvk_086feb17-6360-4d8f-a766-78607300c491/nmstate-operator/0.log" Feb 25 16:58:44 crc kubenswrapper[4937]: I0225 16:58:44.749444 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-ljnzj_fee0f5ff-b02d-4a31-921b-e151949932d1/nmstate-webhook/0.log" Feb 25 16:58:59 crc 
kubenswrapper[4937]: I0225 16:58:59.524327 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-b5f46f5f7-zscl6_b72cc98b-e045-4ade-bdf7-c9929fc489fc/kube-rbac-proxy/0.log" Feb 25 16:58:59 crc kubenswrapper[4937]: I0225 16:58:59.572934 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-b5f46f5f7-zscl6_b72cc98b-e045-4ade-bdf7-c9929fc489fc/manager/0.log" Feb 25 16:59:11 crc kubenswrapper[4937]: I0225 16:59:11.495347 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:59:11 crc kubenswrapper[4937]: I0225 16:59:11.495993 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:59:14 crc kubenswrapper[4937]: I0225 16:59:14.230996 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-46sv9_4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af/prometheus-operator/0.log" Feb 25 16:59:14 crc kubenswrapper[4937]: I0225 16:59:14.682465 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_8f8315c1-97ca-4525-a1a8-afe98581f614/prometheus-operator-admission-webhook/0.log" Feb 25 16:59:14 crc kubenswrapper[4937]: I0225 16:59:14.805706 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_1cad8e2a-5182-4d59-9afa-c64ced98e87b/prometheus-operator-admission-webhook/0.log" Feb 25 16:59:14 crc kubenswrapper[4937]: I0225 16:59:14.953120 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-p5kbh_0eb822b0-826a-4b2d-9376-141a69ba37e5/operator/0.log" Feb 25 16:59:15 crc kubenswrapper[4937]: I0225 16:59:15.001511 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-prw69_26437cd5-3ce5-4d7a-9b7f-9f983015f74d/perses-operator/0.log" Feb 25 16:59:31 crc kubenswrapper[4937]: I0225 16:59:31.434653 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-5vzl9_53cf6067-7864-4449-9f64-2cf8181fec1d/kube-rbac-proxy/0.log" Feb 25 16:59:31 crc kubenswrapper[4937]: I0225 16:59:31.520680 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-5vzl9_53cf6067-7864-4449-9f64-2cf8181fec1d/controller/0.log" Feb 25 16:59:32 crc kubenswrapper[4937]: I0225 16:59:32.291225 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-frr-files/0.log" Feb 25 16:59:32 crc kubenswrapper[4937]: I0225 16:59:32.595209 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-metrics/0.log" Feb 25 16:59:32 crc kubenswrapper[4937]: I0225 16:59:32.598557 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-reloader/0.log" Feb 25 16:59:32 crc kubenswrapper[4937]: I0225 16:59:32.623632 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-reloader/0.log" Feb 25 16:59:32 crc kubenswrapper[4937]: I0225 16:59:32.640422 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-frr-files/0.log" Feb 25 16:59:32 crc kubenswrapper[4937]: I0225 16:59:32.851130 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-frr-files/0.log" Feb 25 16:59:32 crc kubenswrapper[4937]: I0225 16:59:32.851146 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-metrics/0.log" Feb 25 16:59:32 crc kubenswrapper[4937]: I0225 16:59:32.856540 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-reloader/0.log" Feb 25 16:59:32 crc kubenswrapper[4937]: I0225 16:59:32.882679 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-metrics/0.log" Feb 25 16:59:33 crc kubenswrapper[4937]: I0225 16:59:33.022609 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-frr-files/0.log" Feb 25 16:59:33 crc kubenswrapper[4937]: I0225 16:59:33.031326 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-metrics/0.log" Feb 25 16:59:33 crc kubenswrapper[4937]: I0225 16:59:33.041174 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/cp-reloader/0.log" Feb 25 16:59:33 crc kubenswrapper[4937]: I0225 16:59:33.076152 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/controller/0.log" Feb 25 16:59:33 crc kubenswrapper[4937]: I0225 16:59:33.202527 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/frr-metrics/0.log" Feb 25 16:59:33 crc kubenswrapper[4937]: I0225 16:59:33.241976 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/kube-rbac-proxy/0.log" Feb 25 16:59:33 crc kubenswrapper[4937]: I0225 16:59:33.256753 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/kube-rbac-proxy-frr/0.log" Feb 25 16:59:33 crc kubenswrapper[4937]: I0225 16:59:33.515816 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/reloader/0.log" Feb 25 16:59:33 crc kubenswrapper[4937]: I0225 16:59:33.625645 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-zl6xj_f3b9485a-9a4f-467b-9e99-e858b7b47a8b/frr-k8s-webhook-server/0.log" Feb 25 16:59:33 crc kubenswrapper[4937]: I0225 16:59:33.855972 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-678f5df958-zlttq_8ad5751a-e32c-4f13-ab06-b3ddeb681961/manager/0.log" Feb 25 16:59:34 crc kubenswrapper[4937]: I0225 16:59:34.034055 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-59546f7477-2w52w_6faaefa8-4269-448f-90a9-b4af7b5b2eae/webhook-server/0.log" Feb 25 16:59:34 crc kubenswrapper[4937]: I0225 16:59:34.201674 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vpqx7_8c24e8b5-c791-4ceb-9258-fba04c4adf91/kube-rbac-proxy/0.log" Feb 25 16:59:34 crc kubenswrapper[4937]: I0225 16:59:34.812125 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vpqx7_8c24e8b5-c791-4ceb-9258-fba04c4adf91/speaker/0.log" Feb 25 16:59:34 crc kubenswrapper[4937]: I0225 16:59:34.861129 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qlk4x_96b71f98-1da6-4122-828b-1d58fd8e40d3/frr/0.log" Feb 25 16:59:41 crc kubenswrapper[4937]: I0225 16:59:41.496907 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 16:59:41 crc kubenswrapper[4937]: I0225 16:59:41.497366 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 16:59:49 crc kubenswrapper[4937]: I0225 16:59:49.487397 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/util/0.log" Feb 25 16:59:49 crc kubenswrapper[4937]: I0225 16:59:49.654884 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/util/0.log" Feb 25 16:59:49 crc kubenswrapper[4937]: I0225 16:59:49.666258 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/pull/0.log" Feb 25 16:59:49 crc kubenswrapper[4937]: I0225 16:59:49.699110 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/pull/0.log" Feb 25 16:59:49 crc kubenswrapper[4937]: I0225 16:59:49.891765 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/pull/0.log" Feb 25 16:59:49 crc kubenswrapper[4937]: I0225 16:59:49.896178 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/util/0.log" Feb 25 16:59:49 crc kubenswrapper[4937]: I0225 16:59:49.904054 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8229j2k_42fd7b47-664a-4b65-8804-417a7fdd9b2f/extract/0.log" Feb 25 16:59:50 crc kubenswrapper[4937]: I0225 16:59:50.045824 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/util/0.log" Feb 25 16:59:50 crc kubenswrapper[4937]: I0225 16:59:50.233218 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/util/0.log" Feb 25 16:59:50 crc kubenswrapper[4937]: I0225 16:59:50.251174 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/pull/0.log" Feb 25 16:59:50 crc kubenswrapper[4937]: I0225 16:59:50.272924 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/pull/0.log" Feb 25 16:59:50 crc kubenswrapper[4937]: I0225 16:59:50.921859 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/util/0.log" Feb 25 16:59:50 crc kubenswrapper[4937]: I0225 16:59:50.943695 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/extract/0.log" Feb 25 16:59:50 crc kubenswrapper[4937]: I0225 16:59:50.944006 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651m56wv_47f85b62-41af-4e45-af61-33526ba0d867/pull/0.log" Feb 25 16:59:51 crc kubenswrapper[4937]: I0225 16:59:51.090069 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/util/0.log" Feb 25 16:59:51 crc kubenswrapper[4937]: I0225 16:59:51.296860 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/pull/0.log" Feb 25 16:59:51 crc kubenswrapper[4937]: I0225 16:59:51.304140 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/util/0.log" Feb 25 16:59:51 crc kubenswrapper[4937]: I0225 16:59:51.347456 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/pull/0.log" Feb 25 16:59:51 crc kubenswrapper[4937]: I0225 16:59:51.561172 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/util/0.log" Feb 25 16:59:51 crc kubenswrapper[4937]: I0225 16:59:51.591924 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/pull/0.log" Feb 25 16:59:51 crc kubenswrapper[4937]: I0225 16:59:51.610392 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087clt7_6b58d852-ef69-4a94-8e1b-8892612ff7aa/extract/0.log" Feb 25 16:59:51 crc kubenswrapper[4937]: I0225 16:59:51.719169 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/extract-utilities/0.log" Feb 25 16:59:51 crc kubenswrapper[4937]: I0225 16:59:51.907164 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/extract-content/0.log" Feb 25 16:59:51 crc kubenswrapper[4937]: I0225 16:59:51.921074 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/extract-utilities/0.log" Feb 25 16:59:51 crc kubenswrapper[4937]: I0225 16:59:51.931335 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/extract-content/0.log" Feb 25 16:59:52 crc kubenswrapper[4937]: I0225 16:59:52.445628 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/extract-utilities/0.log" Feb 25 16:59:52 crc kubenswrapper[4937]: I0225 16:59:52.492358 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/extract-content/0.log" Feb 25 16:59:52 crc kubenswrapper[4937]: I0225 16:59:52.754104 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/extract-utilities/0.log" Feb 25 16:59:52 crc kubenswrapper[4937]: I0225 16:59:52.797999 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4rckv_712d068b-5e64-41fe-bf7b-839866d10ba9/registry-server/0.log" Feb 25 16:59:52 crc kubenswrapper[4937]: I0225 16:59:52.996366 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/extract-content/0.log" Feb 25 16:59:53 crc kubenswrapper[4937]: I0225 16:59:53.009794 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/extract-utilities/0.log" Feb 25 16:59:53 crc kubenswrapper[4937]: I0225 16:59:53.014920 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/extract-content/0.log" Feb 25 16:59:53 crc kubenswrapper[4937]: I0225 16:59:53.185472 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/extract-content/0.log" Feb 25 16:59:53 crc kubenswrapper[4937]: I0225 16:59:53.200839 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/extract-utilities/0.log" Feb 25 16:59:53 crc kubenswrapper[4937]: I0225 16:59:53.303615 4937 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/util/0.log" Feb 25 16:59:53 crc kubenswrapper[4937]: I0225 16:59:53.529019 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/util/0.log" Feb 25 16:59:53 crc kubenswrapper[4937]: I0225 16:59:53.566742 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/pull/0.log" Feb 25 16:59:53 crc kubenswrapper[4937]: I0225 16:59:53.589030 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/pull/0.log" Feb 25 16:59:53 crc kubenswrapper[4937]: I0225 16:59:53.855571 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6dqwg_18855c35-8e7b-4089-848f-e325b779dc51/registry-server/0.log" Feb 25 16:59:53 crc kubenswrapper[4937]: I0225 16:59:53.870381 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/pull/0.log" Feb 25 16:59:53 crc kubenswrapper[4937]: I0225 16:59:53.881207 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/util/0.log" Feb 25 16:59:53 crc kubenswrapper[4937]: I0225 16:59:53.896720 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4srkrw_c1d8e5a9-c042-4057-bda5-874d8f7fc926/extract/0.log" Feb 25 16:59:54 crc kubenswrapper[4937]: I0225 16:59:54.037008 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nbj4m_44bb6c0d-ba9f-46a7-86dc-04f6370ab1c6/marketplace-operator/0.log" Feb 25 16:59:54 crc kubenswrapper[4937]: I0225 16:59:54.052593 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/extract-utilities/0.log" Feb 25 16:59:54 crc kubenswrapper[4937]: I0225 16:59:54.213370 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/extract-content/0.log" Feb 25 16:59:54 crc kubenswrapper[4937]: I0225 16:59:54.219312 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/extract-utilities/0.log" Feb 25 16:59:54 crc kubenswrapper[4937]: I0225 16:59:54.248317 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/extract-content/0.log" Feb 25 16:59:54 crc kubenswrapper[4937]: I0225 16:59:54.412561 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/extract-utilities/0.log" Feb 25 16:59:54 crc kubenswrapper[4937]: I0225 16:59:54.484821 4937 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/extract-content/0.log" Feb 25 16:59:54 crc kubenswrapper[4937]: I0225 16:59:54.524574 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/extract-utilities/0.log" Feb 25 16:59:54 crc kubenswrapper[4937]: I0225 16:59:54.586082 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8n9q4_22c34722-ce3c-4f34-9a65-3a8ccdbb0673/registry-server/0.log" Feb 25 16:59:54 crc kubenswrapper[4937]: I0225 16:59:54.740601 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/extract-utilities/0.log" Feb 25 16:59:54 crc kubenswrapper[4937]: I0225 16:59:54.770880 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/extract-content/0.log" Feb 25 16:59:54 crc kubenswrapper[4937]: I0225 16:59:54.787664 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/extract-content/0.log" Feb 25 16:59:54 crc kubenswrapper[4937]: I0225 16:59:54.992282 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/extract-utilities/0.log" Feb 25 16:59:55 crc kubenswrapper[4937]: I0225 16:59:55.035927 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/extract-content/0.log" Feb 25 16:59:55 crc kubenswrapper[4937]: I0225 16:59:55.547559 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fswxh_d1d5578b-cf1d-4208-91b2-2019dff70a16/registry-server/0.log" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.160821 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw"] Feb 25 17:00:00 crc kubenswrapper[4937]: E0225 17:00:00.161939 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abf696b-d380-4124-a928-223c19fed0ce" containerName="extract-utilities" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.161956 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abf696b-d380-4124-a928-223c19fed0ce" containerName="extract-utilities" Feb 25 17:00:00 crc kubenswrapper[4937]: E0225 17:00:00.161966 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37aa378c-2e15-4213-9554-d54aab5803da" containerName="oc" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.161973 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="37aa378c-2e15-4213-9554-d54aab5803da" containerName="oc" Feb 25 17:00:00 crc kubenswrapper[4937]: E0225 17:00:00.161999 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abf696b-d380-4124-a928-223c19fed0ce" containerName="extract-content" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.162006 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abf696b-d380-4124-a928-223c19fed0ce" containerName="extract-content" Feb 25 17:00:00 crc kubenswrapper[4937]: E0225 17:00:00.162030 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abf696b-d380-4124-a928-223c19fed0ce" 
containerName="registry-server" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.162037 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abf696b-d380-4124-a928-223c19fed0ce" containerName="registry-server" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.162277 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abf696b-d380-4124-a928-223c19fed0ce" containerName="registry-server" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.162302 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="37aa378c-2e15-4213-9554-d54aab5803da" containerName="oc" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.163260 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.165032 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.165818 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.179379 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533980-4p6s6"] Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.180862 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533980-4p6s6" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.182854 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.183778 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.184244 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.190550 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533980-4p6s6"] Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.200682 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw"] Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.225643 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwvt9\" (UniqueName: \"kubernetes.io/projected/ef5d363a-dc74-48eb-93dc-51d63c9fed44-kube-api-access-qwvt9\") pod \"collect-profiles-29533980-mhxvw\" (UID: \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.225753 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef5d363a-dc74-48eb-93dc-51d63c9fed44-secret-volume\") pod \"collect-profiles-29533980-mhxvw\" (UID: \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.225820 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef5d363a-dc74-48eb-93dc-51d63c9fed44-config-volume\") pod \"collect-profiles-29533980-mhxvw\" (UID: \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.225908 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwn4f\" (UniqueName: \"kubernetes.io/projected/e87edea4-e15e-4b00-b34e-5097d7206b4e-kube-api-access-zwn4f\") pod \"auto-csr-approver-29533980-4p6s6\" (UID: \"e87edea4-e15e-4b00-b34e-5097d7206b4e\") " pod="openshift-infra/auto-csr-approver-29533980-4p6s6" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.328430 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwn4f\" (UniqueName: \"kubernetes.io/projected/e87edea4-e15e-4b00-b34e-5097d7206b4e-kube-api-access-zwn4f\") pod \"auto-csr-approver-29533980-4p6s6\" (UID: \"e87edea4-e15e-4b00-b34e-5097d7206b4e\") " pod="openshift-infra/auto-csr-approver-29533980-4p6s6" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.328597 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwvt9\" (UniqueName: \"kubernetes.io/projected/ef5d363a-dc74-48eb-93dc-51d63c9fed44-kube-api-access-qwvt9\") pod \"collect-profiles-29533980-mhxvw\" (UID: \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.328683 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef5d363a-dc74-48eb-93dc-51d63c9fed44-secret-volume\") pod \"collect-profiles-29533980-mhxvw\" (UID: \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.328748 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef5d363a-dc74-48eb-93dc-51d63c9fed44-config-volume\") pod \"collect-profiles-29533980-mhxvw\" (UID: \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.330007 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef5d363a-dc74-48eb-93dc-51d63c9fed44-config-volume\") pod \"collect-profiles-29533980-mhxvw\" (UID: \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.380009 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef5d363a-dc74-48eb-93dc-51d63c9fed44-secret-volume\") pod \"collect-profiles-29533980-mhxvw\" (UID: \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.380511 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwvt9\" (UniqueName: \"kubernetes.io/projected/ef5d363a-dc74-48eb-93dc-51d63c9fed44-kube-api-access-qwvt9\") pod \"collect-profiles-29533980-mhxvw\" (UID: 
\"ef5d363a-dc74-48eb-93dc-51d63c9fed44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.380563 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwn4f\" (UniqueName: \"kubernetes.io/projected/e87edea4-e15e-4b00-b34e-5097d7206b4e-kube-api-access-zwn4f\") pod \"auto-csr-approver-29533980-4p6s6\" (UID: \"e87edea4-e15e-4b00-b34e-5097d7206b4e\") " pod="openshift-infra/auto-csr-approver-29533980-4p6s6" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.489438 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.502265 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533980-4p6s6" Feb 25 17:00:00 crc kubenswrapper[4937]: I0225 17:00:00.986907 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw"] Feb 25 17:00:01 crc kubenswrapper[4937]: I0225 17:00:01.159726 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533980-4p6s6"] Feb 25 17:00:01 crc kubenswrapper[4937]: I0225 17:00:01.680970 4937 generic.go:334] "Generic (PLEG): container finished" podID="ef5d363a-dc74-48eb-93dc-51d63c9fed44" containerID="e185f90da3a1960db986d59aad25bbd1101e94b184b679a066388da78256b170" exitCode=0 Feb 25 17:00:01 crc kubenswrapper[4937]: I0225 17:00:01.681047 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" event={"ID":"ef5d363a-dc74-48eb-93dc-51d63c9fed44","Type":"ContainerDied","Data":"e185f90da3a1960db986d59aad25bbd1101e94b184b679a066388da78256b170"} Feb 25 17:00:01 crc kubenswrapper[4937]: I0225 17:00:01.681919 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" event={"ID":"ef5d363a-dc74-48eb-93dc-51d63c9fed44","Type":"ContainerStarted","Data":"6aa38973b3052793bf57b1aa14b2f917a7d684772c0a406c9c0981f3fbc43779"} Feb 25 17:00:01 crc kubenswrapper[4937]: I0225 17:00:01.683730 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533980-4p6s6" event={"ID":"e87edea4-e15e-4b00-b34e-5097d7206b4e","Type":"ContainerStarted","Data":"bd0a194fc1ef3d6ae7b72d46e9b9c57d4c093d9f98c7ffb36af574e135cec281"} Feb 25 17:00:03 crc kubenswrapper[4937]: I0225 17:00:03.301693 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" Feb 25 17:00:03 crc kubenswrapper[4937]: I0225 17:00:03.404143 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef5d363a-dc74-48eb-93dc-51d63c9fed44-config-volume\") pod \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\" (UID: \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\") " Feb 25 17:00:03 crc kubenswrapper[4937]: I0225 17:00:03.404218 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwvt9\" (UniqueName: \"kubernetes.io/projected/ef5d363a-dc74-48eb-93dc-51d63c9fed44-kube-api-access-qwvt9\") pod \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\" (UID: \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\") " Feb 25 17:00:03 crc kubenswrapper[4937]: I0225 17:00:03.404528 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef5d363a-dc74-48eb-93dc-51d63c9fed44-secret-volume\") pod \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\" (UID: \"ef5d363a-dc74-48eb-93dc-51d63c9fed44\") " Feb 25 17:00:03 crc kubenswrapper[4937]: I0225 17:00:03.405128 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef5d363a-dc74-48eb-93dc-51d63c9fed44-config-volume" (OuterVolumeSpecName: "config-volume") pod "ef5d363a-dc74-48eb-93dc-51d63c9fed44" (UID: "ef5d363a-dc74-48eb-93dc-51d63c9fed44"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 17:00:03 crc kubenswrapper[4937]: I0225 17:00:03.411773 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5d363a-dc74-48eb-93dc-51d63c9fed44-kube-api-access-qwvt9" (OuterVolumeSpecName: "kube-api-access-qwvt9") pod "ef5d363a-dc74-48eb-93dc-51d63c9fed44" (UID: "ef5d363a-dc74-48eb-93dc-51d63c9fed44"). InnerVolumeSpecName "kube-api-access-qwvt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 17:00:03 crc kubenswrapper[4937]: I0225 17:00:03.413407 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5d363a-dc74-48eb-93dc-51d63c9fed44-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ef5d363a-dc74-48eb-93dc-51d63c9fed44" (UID: "ef5d363a-dc74-48eb-93dc-51d63c9fed44"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 17:00:03 crc kubenswrapper[4937]: I0225 17:00:03.507335 4937 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef5d363a-dc74-48eb-93dc-51d63c9fed44-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 17:00:03 crc kubenswrapper[4937]: I0225 17:00:03.507373 4937 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef5d363a-dc74-48eb-93dc-51d63c9fed44-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 17:00:03 crc kubenswrapper[4937]: I0225 17:00:03.507387 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwvt9\" (UniqueName: \"kubernetes.io/projected/ef5d363a-dc74-48eb-93dc-51d63c9fed44-kube-api-access-qwvt9\") on node \"crc\" DevicePath \"\"" Feb 25 17:00:03 crc kubenswrapper[4937]: I0225 17:00:03.707972 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" event={"ID":"ef5d363a-dc74-48eb-93dc-51d63c9fed44","Type":"ContainerDied","Data":"6aa38973b3052793bf57b1aa14b2f917a7d684772c0a406c9c0981f3fbc43779"} Feb 25 17:00:03 crc kubenswrapper[4937]: I0225 17:00:03.708529 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aa38973b3052793bf57b1aa14b2f917a7d684772c0a406c9c0981f3fbc43779" Feb 25 17:00:03 crc kubenswrapper[4937]: I0225 17:00:03.708016 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533980-mhxvw" Feb 25 17:00:04 crc kubenswrapper[4937]: I0225 17:00:04.390065 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf"] Feb 25 17:00:04 crc kubenswrapper[4937]: I0225 17:00:04.402963 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533935-jmtbf"] Feb 25 17:00:04 crc kubenswrapper[4937]: I0225 17:00:04.718240 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533980-4p6s6" event={"ID":"e87edea4-e15e-4b00-b34e-5097d7206b4e","Type":"ContainerStarted","Data":"afab612230d80486f74fe83ca0f6a3974a64f1110f38908bd02cc943639c9b3f"} Feb 25 17:00:04 crc kubenswrapper[4937]: I0225 17:00:04.747114 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533980-4p6s6" podStartSLOduration=1.6369667589999999 podStartE2EDuration="4.747089228s" podCreationTimestamp="2026-02-25 17:00:00 +0000 UTC" firstStartedPulling="2026-02-25 17:00:01.173573176 +0000 UTC m=+4452.186965066" lastFinishedPulling="2026-02-25 17:00:04.283695635 +0000 UTC m=+4455.297087535" observedRunningTime="2026-02-25 17:00:04.737644522 +0000 UTC m=+4455.751036412" watchObservedRunningTime="2026-02-25 17:00:04.747089228 +0000 UTC m=+4455.760481118" Feb 25 17:00:05 crc kubenswrapper[4937]: I0225 17:00:05.408182 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7fd2813-1349-41f2-bcee-c2a6650cfafc" path="/var/lib/kubelet/pods/d7fd2813-1349-41f2-bcee-c2a6650cfafc/volumes" Feb 25 17:00:05 crc kubenswrapper[4937]: I0225 17:00:05.740762 4937 generic.go:334] "Generic (PLEG): container finished" podID="e87edea4-e15e-4b00-b34e-5097d7206b4e" containerID="afab612230d80486f74fe83ca0f6a3974a64f1110f38908bd02cc943639c9b3f" exitCode=0 Feb 25 17:00:05 crc kubenswrapper[4937]: I0225 
17:00:05.740822 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533980-4p6s6" event={"ID":"e87edea4-e15e-4b00-b34e-5097d7206b4e","Type":"ContainerDied","Data":"afab612230d80486f74fe83ca0f6a3974a64f1110f38908bd02cc943639c9b3f"} Feb 25 17:00:07 crc kubenswrapper[4937]: I0225 17:00:07.419699 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533980-4p6s6" Feb 25 17:00:07 crc kubenswrapper[4937]: I0225 17:00:07.488480 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwn4f\" (UniqueName: \"kubernetes.io/projected/e87edea4-e15e-4b00-b34e-5097d7206b4e-kube-api-access-zwn4f\") pod \"e87edea4-e15e-4b00-b34e-5097d7206b4e\" (UID: \"e87edea4-e15e-4b00-b34e-5097d7206b4e\") " Feb 25 17:00:07 crc kubenswrapper[4937]: I0225 17:00:07.498617 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e87edea4-e15e-4b00-b34e-5097d7206b4e-kube-api-access-zwn4f" (OuterVolumeSpecName: "kube-api-access-zwn4f") pod "e87edea4-e15e-4b00-b34e-5097d7206b4e" (UID: "e87edea4-e15e-4b00-b34e-5097d7206b4e"). InnerVolumeSpecName "kube-api-access-zwn4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 17:00:07 crc kubenswrapper[4937]: I0225 17:00:07.590658 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwn4f\" (UniqueName: \"kubernetes.io/projected/e87edea4-e15e-4b00-b34e-5097d7206b4e-kube-api-access-zwn4f\") on node \"crc\" DevicePath \"\"" Feb 25 17:00:07 crc kubenswrapper[4937]: I0225 17:00:07.764677 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533980-4p6s6" event={"ID":"e87edea4-e15e-4b00-b34e-5097d7206b4e","Type":"ContainerDied","Data":"bd0a194fc1ef3d6ae7b72d46e9b9c57d4c093d9f98c7ffb36af574e135cec281"} Feb 25 17:00:07 crc kubenswrapper[4937]: I0225 17:00:07.764768 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd0a194fc1ef3d6ae7b72d46e9b9c57d4c093d9f98c7ffb36af574e135cec281" Feb 25 17:00:07 crc kubenswrapper[4937]: I0225 17:00:07.764873 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533980-4p6s6" Feb 25 17:00:07 crc kubenswrapper[4937]: I0225 17:00:07.795366 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533974-p9qq8"] Feb 25 17:00:07 crc kubenswrapper[4937]: I0225 17:00:07.807899 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533974-p9qq8"] Feb 25 17:00:09 crc kubenswrapper[4937]: I0225 17:00:09.380316 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03" path="/var/lib/kubelet/pods/eb19ce95-e4c9-4f5f-a682-0bf8c1f94c03/volumes" Feb 25 17:00:10 crc kubenswrapper[4937]: I0225 17:00:10.105187 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-46sv9_4e382460-7f5c-4a5d-ae1f-f1cbbdbfa6af/prometheus-operator/0.log" Feb 25 17:00:10 crc kubenswrapper[4937]: I0225 17:00:10.197051 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-774985d7f8-8xpg9_8f8315c1-97ca-4525-a1a8-afe98581f614/prometheus-operator-admission-webhook/0.log" Feb 25 17:00:10 crc kubenswrapper[4937]: I0225 17:00:10.216754 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-774985d7f8-zbq2b_1cad8e2a-5182-4d59-9afa-c64ced98e87b/prometheus-operator-admission-webhook/0.log" Feb 25 17:00:10 crc kubenswrapper[4937]: I0225 17:00:10.323512 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-p5kbh_0eb822b0-826a-4b2d-9376-141a69ba37e5/operator/0.log" Feb 25 17:00:10 crc kubenswrapper[4937]: I0225 17:00:10.385944 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-prw69_26437cd5-3ce5-4d7a-9b7f-9f983015f74d/perses-operator/0.log" Feb 25 17:00:11 crc kubenswrapper[4937]: I0225 17:00:11.494578 4937 patch_prober.go:28] interesting pod/machine-config-daemon-2r4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 17:00:11 crc kubenswrapper[4937]: I0225 17:00:11.494924 4937 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 17:00:11 crc kubenswrapper[4937]: I0225 17:00:11.494968 4937 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" Feb 25 17:00:11 crc kubenswrapper[4937]: I0225 17:00:11.495714 4937 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30"} pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 17:00:11 crc kubenswrapper[4937]: I0225 17:00:11.495770 4937 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerName="machine-config-daemon" containerID="cri-o://9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" gracePeriod=600 Feb 25 17:00:11 crc kubenswrapper[4937]: E0225 17:00:11.686873 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:00:11 crc kubenswrapper[4937]: I0225 17:00:11.805310 4937 generic.go:334] "Generic (PLEG): container finished" podID="8f826096-fb93-42fe-a779-9afe1d36f2d4" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" exitCode=0 Feb 25 17:00:11 crc kubenswrapper[4937]: I0225 17:00:11.805366 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerDied","Data":"9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30"} Feb 25 17:00:11 crc kubenswrapper[4937]: I0225 17:00:11.805405 4937 scope.go:117] "RemoveContainer" containerID="0ce94622471d9659329394fa1b2af5fd3461490cd063464e738059704ab88ac2" Feb 25 17:00:11 crc kubenswrapper[4937]: I0225 17:00:11.806222 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:00:11 crc kubenswrapper[4937]: E0225 17:00:11.806724 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:00:22 crc kubenswrapper[4937]: I0225 17:00:22.368075 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:00:22 crc kubenswrapper[4937]: E0225 17:00:22.369674 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:00:25 crc kubenswrapper[4937]: I0225 17:00:25.353721 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-b5f46f5f7-zscl6_b72cc98b-e045-4ade-bdf7-c9929fc489fc/kube-rbac-proxy/0.log" Feb 25 17:00:25 crc kubenswrapper[4937]: I0225 17:00:25.432532 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-b5f46f5f7-zscl6_b72cc98b-e045-4ade-bdf7-c9929fc489fc/manager/0.log" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.469584 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4mlwk"] Feb 
25 17:00:30 crc kubenswrapper[4937]: E0225 17:00:30.470586 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5d363a-dc74-48eb-93dc-51d63c9fed44" containerName="collect-profiles" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.470598 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5d363a-dc74-48eb-93dc-51d63c9fed44" containerName="collect-profiles" Feb 25 17:00:30 crc kubenswrapper[4937]: E0225 17:00:30.470614 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e87edea4-e15e-4b00-b34e-5097d7206b4e" containerName="oc" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.470620 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="e87edea4-e15e-4b00-b34e-5097d7206b4e" containerName="oc" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.470811 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="e87edea4-e15e-4b00-b34e-5097d7206b4e" containerName="oc" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.470830 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5d363a-dc74-48eb-93dc-51d63c9fed44" containerName="collect-profiles" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.472251 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.484703 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefa14cc-f644-42d0-bdfa-922cb340d691-utilities\") pod \"redhat-marketplace-4mlwk\" (UID: \"cefa14cc-f644-42d0-bdfa-922cb340d691\") " pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.484850 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb8tj\" (UniqueName: \"kubernetes.io/projected/cefa14cc-f644-42d0-bdfa-922cb340d691-kube-api-access-tb8tj\") pod \"redhat-marketplace-4mlwk\" (UID: \"cefa14cc-f644-42d0-bdfa-922cb340d691\") " pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.484904 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefa14cc-f644-42d0-bdfa-922cb340d691-catalog-content\") pod \"redhat-marketplace-4mlwk\" (UID: \"cefa14cc-f644-42d0-bdfa-922cb340d691\") " pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.505057 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mlwk"] Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.587304 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefa14cc-f644-42d0-bdfa-922cb340d691-catalog-content\") pod \"redhat-marketplace-4mlwk\" (UID: \"cefa14cc-f644-42d0-bdfa-922cb340d691\") " pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.587449 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefa14cc-f644-42d0-bdfa-922cb340d691-utilities\") pod \"redhat-marketplace-4mlwk\" (UID: \"cefa14cc-f644-42d0-bdfa-922cb340d691\") " pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 
17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.587550 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb8tj\" (UniqueName: \"kubernetes.io/projected/cefa14cc-f644-42d0-bdfa-922cb340d691-kube-api-access-tb8tj\") pod \"redhat-marketplace-4mlwk\" (UID: \"cefa14cc-f644-42d0-bdfa-922cb340d691\") " pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.588367 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefa14cc-f644-42d0-bdfa-922cb340d691-catalog-content\") pod \"redhat-marketplace-4mlwk\" (UID: \"cefa14cc-f644-42d0-bdfa-922cb340d691\") " pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.588400 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefa14cc-f644-42d0-bdfa-922cb340d691-utilities\") pod \"redhat-marketplace-4mlwk\" (UID: \"cefa14cc-f644-42d0-bdfa-922cb340d691\") " pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.611299 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb8tj\" (UniqueName: \"kubernetes.io/projected/cefa14cc-f644-42d0-bdfa-922cb340d691-kube-api-access-tb8tj\") pod \"redhat-marketplace-4mlwk\" (UID: \"cefa14cc-f644-42d0-bdfa-922cb340d691\") " pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:30 crc kubenswrapper[4937]: I0225 17:00:30.797579 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:31 crc kubenswrapper[4937]: I0225 17:00:31.360877 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mlwk"] Feb 25 17:00:32 crc kubenswrapper[4937]: I0225 17:00:32.019838 4937 generic.go:334] "Generic (PLEG): container finished" podID="cefa14cc-f644-42d0-bdfa-922cb340d691" containerID="f6bcb9658f8e28518482a227a0133178c1b92fa3df5541ff94fa8868f90a9271" exitCode=0 Feb 25 17:00:32 crc kubenswrapper[4937]: I0225 17:00:32.020609 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mlwk" event={"ID":"cefa14cc-f644-42d0-bdfa-922cb340d691","Type":"ContainerDied","Data":"f6bcb9658f8e28518482a227a0133178c1b92fa3df5541ff94fa8868f90a9271"} Feb 25 17:00:32 crc kubenswrapper[4937]: I0225 17:00:32.020652 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mlwk" event={"ID":"cefa14cc-f644-42d0-bdfa-922cb340d691","Type":"ContainerStarted","Data":"0860e71c4ae80c419d1eb0045b7471398106acc935509217de1337b044dc96cf"} Feb 25 17:00:33 crc kubenswrapper[4937]: I0225 17:00:33.031994 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mlwk" event={"ID":"cefa14cc-f644-42d0-bdfa-922cb340d691","Type":"ContainerStarted","Data":"9fcea761e272f622deaf58886cca73e75b6dcd40ada5ae60fa3427b7eb80dacf"} Feb 25 17:00:35 crc kubenswrapper[4937]: I0225 17:00:35.051963 4937 generic.go:334] "Generic (PLEG): container finished" podID="cefa14cc-f644-42d0-bdfa-922cb340d691" containerID="9fcea761e272f622deaf58886cca73e75b6dcd40ada5ae60fa3427b7eb80dacf" exitCode=0 Feb 25 17:00:35 crc kubenswrapper[4937]: I0225 17:00:35.052035 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-4mlwk" event={"ID":"cefa14cc-f644-42d0-bdfa-922cb340d691","Type":"ContainerDied","Data":"9fcea761e272f622deaf58886cca73e75b6dcd40ada5ae60fa3427b7eb80dacf"} Feb 25 17:00:35 crc kubenswrapper[4937]: I0225 17:00:35.367526 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:00:35 crc kubenswrapper[4937]: E0225 17:00:35.368044 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:00:36 crc kubenswrapper[4937]: I0225 17:00:36.074122 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mlwk" event={"ID":"cefa14cc-f644-42d0-bdfa-922cb340d691","Type":"ContainerStarted","Data":"eb2ef53fd85d395e28aaa84816d95c1c1562875b64b4778d1115bbc31ebffe79"} Feb 25 17:00:36 crc kubenswrapper[4937]: I0225 17:00:36.095675 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4mlwk" podStartSLOduration=2.4144958020000002 podStartE2EDuration="6.095655575s" podCreationTimestamp="2026-02-25 17:00:30 +0000 UTC" firstStartedPulling="2026-02-25 17:00:32.028240528 +0000 UTC m=+4483.041632418" lastFinishedPulling="2026-02-25 17:00:35.709400301 +0000 UTC m=+4486.722792191" observedRunningTime="2026-02-25 17:00:36.092593449 +0000 UTC m=+4487.105985339" watchObservedRunningTime="2026-02-25 17:00:36.095655575 +0000 UTC m=+4487.109047465" Feb 25 17:00:40 crc kubenswrapper[4937]: I0225 17:00:40.798230 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:40 crc kubenswrapper[4937]: I0225 17:00:40.798922 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:41 crc kubenswrapper[4937]: I0225 17:00:41.858831 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-4mlwk" podUID="cefa14cc-f644-42d0-bdfa-922cb340d691" containerName="registry-server" probeResult="failure" output=< Feb 25 17:00:41 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 17:00:41 crc kubenswrapper[4937]: > Feb 25 17:00:44 crc kubenswrapper[4937]: I0225 17:00:44.467591 4937 scope.go:117] "RemoveContainer" containerID="44e5fad3959c5a8bcc1467137d12bc42ab0bd19b9b02df271413ee76bf7e13bd" Feb 25 17:00:44 crc kubenswrapper[4937]: I0225 17:00:44.509471 4937 scope.go:117] "RemoveContainer" containerID="96156e3c2ea3d1cca434e94132995ab19cb05f3e731740baafbdaed496f5ca16" Feb 25 17:00:47 crc kubenswrapper[4937]: I0225 17:00:47.368079 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:00:47 crc kubenswrapper[4937]: E0225 17:00:47.369019 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:00:50 crc kubenswrapper[4937]: I0225 17:00:50.902080 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:50 crc kubenswrapper[4937]: I0225 17:00:50.960401 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:51 crc kubenswrapper[4937]: I0225 17:00:51.145309 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mlwk"] Feb 25 17:00:52 crc kubenswrapper[4937]: I0225 17:00:52.269136 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4mlwk" podUID="cefa14cc-f644-42d0-bdfa-922cb340d691" containerName="registry-server" containerID="cri-o://eb2ef53fd85d395e28aaa84816d95c1c1562875b64b4778d1115bbc31ebffe79" gracePeriod=2 Feb 25 17:00:53 crc kubenswrapper[4937]: I0225 17:00:53.279118 4937 generic.go:334] "Generic (PLEG): container finished" podID="cefa14cc-f644-42d0-bdfa-922cb340d691" containerID="eb2ef53fd85d395e28aaa84816d95c1c1562875b64b4778d1115bbc31ebffe79" exitCode=0 Feb 25 17:00:53 crc kubenswrapper[4937]: I0225 17:00:53.279199 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mlwk" event={"ID":"cefa14cc-f644-42d0-bdfa-922cb340d691","Type":"ContainerDied","Data":"eb2ef53fd85d395e28aaa84816d95c1c1562875b64b4778d1115bbc31ebffe79"} Feb 25 17:00:53 crc kubenswrapper[4937]: I0225 17:00:53.561972 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:53 crc kubenswrapper[4937]: I0225 17:00:53.613726 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefa14cc-f644-42d0-bdfa-922cb340d691-utilities\") pod \"cefa14cc-f644-42d0-bdfa-922cb340d691\" (UID: \"cefa14cc-f644-42d0-bdfa-922cb340d691\") " Feb 25 17:00:53 crc kubenswrapper[4937]: I0225 17:00:53.613896 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb8tj\" (UniqueName: \"kubernetes.io/projected/cefa14cc-f644-42d0-bdfa-922cb340d691-kube-api-access-tb8tj\") pod \"cefa14cc-f644-42d0-bdfa-922cb340d691\" (UID: \"cefa14cc-f644-42d0-bdfa-922cb340d691\") " Feb 25 17:00:53 crc kubenswrapper[4937]: I0225 17:00:53.614068 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefa14cc-f644-42d0-bdfa-922cb340d691-catalog-content\") pod \"cefa14cc-f644-42d0-bdfa-922cb340d691\" (UID: \"cefa14cc-f644-42d0-bdfa-922cb340d691\") " Feb 25 17:00:53 crc kubenswrapper[4937]: I0225 17:00:53.614753 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cefa14cc-f644-42d0-bdfa-922cb340d691-utilities" (OuterVolumeSpecName: "utilities") pod "cefa14cc-f644-42d0-bdfa-922cb340d691" (UID: "cefa14cc-f644-42d0-bdfa-922cb340d691"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 17:00:53 crc kubenswrapper[4937]: I0225 17:00:53.640082 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cefa14cc-f644-42d0-bdfa-922cb340d691-kube-api-access-tb8tj" (OuterVolumeSpecName: "kube-api-access-tb8tj") pod "cefa14cc-f644-42d0-bdfa-922cb340d691" (UID: "cefa14cc-f644-42d0-bdfa-922cb340d691"). InnerVolumeSpecName "kube-api-access-tb8tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 17:00:53 crc kubenswrapper[4937]: I0225 17:00:53.659623 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cefa14cc-f644-42d0-bdfa-922cb340d691-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cefa14cc-f644-42d0-bdfa-922cb340d691" (UID: "cefa14cc-f644-42d0-bdfa-922cb340d691"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 17:00:53 crc kubenswrapper[4937]: I0225 17:00:53.716666 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cefa14cc-f644-42d0-bdfa-922cb340d691-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 17:00:53 crc kubenswrapper[4937]: I0225 17:00:53.716912 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cefa14cc-f644-42d0-bdfa-922cb340d691-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 17:00:53 crc kubenswrapper[4937]: I0225 17:00:53.716978 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb8tj\" (UniqueName: \"kubernetes.io/projected/cefa14cc-f644-42d0-bdfa-922cb340d691-kube-api-access-tb8tj\") on node \"crc\" DevicePath \"\"" Feb 25 17:00:54 crc kubenswrapper[4937]: I0225 17:00:54.290368 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mlwk" event={"ID":"cefa14cc-f644-42d0-bdfa-922cb340d691","Type":"ContainerDied","Data":"0860e71c4ae80c419d1eb0045b7471398106acc935509217de1337b044dc96cf"} Feb 25 17:00:54 crc kubenswrapper[4937]: I0225 17:00:54.290436 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mlwk" Feb 25 17:00:54 crc kubenswrapper[4937]: I0225 17:00:54.290624 4937 scope.go:117] "RemoveContainer" containerID="eb2ef53fd85d395e28aaa84816d95c1c1562875b64b4778d1115bbc31ebffe79" Feb 25 17:00:54 crc kubenswrapper[4937]: I0225 17:00:54.326688 4937 scope.go:117] "RemoveContainer" containerID="9fcea761e272f622deaf58886cca73e75b6dcd40ada5ae60fa3427b7eb80dacf" Feb 25 17:00:54 crc kubenswrapper[4937]: I0225 17:00:54.333598 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mlwk"] Feb 25 17:00:54 crc kubenswrapper[4937]: I0225 17:00:54.354375 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mlwk"] Feb 25 17:00:54 crc kubenswrapper[4937]: I0225 17:00:54.389940 4937 scope.go:117] "RemoveContainer" containerID="f6bcb9658f8e28518482a227a0133178c1b92fa3df5541ff94fa8868f90a9271" Feb 25 17:00:55 crc kubenswrapper[4937]: I0225 17:00:55.385239 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cefa14cc-f644-42d0-bdfa-922cb340d691" path="/var/lib/kubelet/pods/cefa14cc-f644-42d0-bdfa-922cb340d691/volumes" Feb 25 17:00:58 crc kubenswrapper[4937]: I0225 17:00:58.367998 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:00:58 crc kubenswrapper[4937]: E0225 17:00:58.368737 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.171921 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29533981-4lxr5"] Feb 25 17:01:00 crc kubenswrapper[4937]: E0225 17:01:00.173080 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefa14cc-f644-42d0-bdfa-922cb340d691" containerName="registry-server" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.173108 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefa14cc-f644-42d0-bdfa-922cb340d691" containerName="registry-server" Feb 25 17:01:00 crc kubenswrapper[4937]: E0225 17:01:00.173189 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefa14cc-f644-42d0-bdfa-922cb340d691" containerName="extract-utilities" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.173203 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefa14cc-f644-42d0-bdfa-922cb340d691" containerName="extract-utilities" Feb 25 17:01:00 crc kubenswrapper[4937]: E0225 17:01:00.173221 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefa14cc-f644-42d0-bdfa-922cb340d691" containerName="extract-content" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.173236 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefa14cc-f644-42d0-bdfa-922cb340d691" containerName="extract-content" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.173788 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="cefa14cc-f644-42d0-bdfa-922cb340d691" containerName="registry-server" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.175619 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.186836 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29533981-4lxr5"] Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.263897 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9wh2\" (UniqueName: \"kubernetes.io/projected/c7a7261c-41bd-407b-b8a3-3a294abe82ed-kube-api-access-j9wh2\") pod \"keystone-cron-29533981-4lxr5\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.263993 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-fernet-keys\") pod \"keystone-cron-29533981-4lxr5\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.264042 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-combined-ca-bundle\") pod \"keystone-cron-29533981-4lxr5\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.264149 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-config-data\") pod \"keystone-cron-29533981-4lxr5\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.366092 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-combined-ca-bundle\") pod \"keystone-cron-29533981-4lxr5\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.366147 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-config-data\") pod \"keystone-cron-29533981-4lxr5\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.366285 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9wh2\" (UniqueName: \"kubernetes.io/projected/c7a7261c-41bd-407b-b8a3-3a294abe82ed-kube-api-access-j9wh2\") pod \"keystone-cron-29533981-4lxr5\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.366322 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-fernet-keys\") pod \"keystone-cron-29533981-4lxr5\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.372360 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-config-data\") pod \"keystone-cron-29533981-4lxr5\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.376325 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-fernet-keys\") pod \"keystone-cron-29533981-4lxr5\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.380817 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-combined-ca-bundle\") pod \"keystone-cron-29533981-4lxr5\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.403000 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9wh2\" (UniqueName: \"kubernetes.io/projected/c7a7261c-41bd-407b-b8a3-3a294abe82ed-kube-api-access-j9wh2\") pod \"keystone-cron-29533981-4lxr5\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.504214 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:00 crc kubenswrapper[4937]: I0225 17:01:00.992953 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29533981-4lxr5"] Feb 25 17:01:01 crc kubenswrapper[4937]: I0225 17:01:01.393306 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29533981-4lxr5" event={"ID":"c7a7261c-41bd-407b-b8a3-3a294abe82ed","Type":"ContainerStarted","Data":"06d8601d316dc466eef35ba1bab363dd70be9dfee866960751ed2d470264de03"} Feb 25 17:01:01 crc kubenswrapper[4937]: I0225 17:01:01.393767 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29533981-4lxr5" event={"ID":"c7a7261c-41bd-407b-b8a3-3a294abe82ed","Type":"ContainerStarted","Data":"2108d60c28132c3dd8d76a518061a4245e02fbe73b9f17d59e9b992981855204"} Feb 25 17:01:01 crc kubenswrapper[4937]: I0225 17:01:01.424624 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29533981-4lxr5" podStartSLOduration=1.4245978670000001 podStartE2EDuration="1.424597867s" podCreationTimestamp="2026-02-25 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 17:01:01.420553066 +0000 UTC m=+4512.433944976" watchObservedRunningTime="2026-02-25 17:01:01.424597867 +0000 UTC m=+4512.437989757" Feb 25 17:01:04 crc kubenswrapper[4937]: I0225 17:01:04.431102 4937 generic.go:334] "Generic (PLEG): container finished" podID="c7a7261c-41bd-407b-b8a3-3a294abe82ed" containerID="06d8601d316dc466eef35ba1bab363dd70be9dfee866960751ed2d470264de03" exitCode=0 Feb 25 17:01:04 crc kubenswrapper[4937]: I0225 17:01:04.431837 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29533981-4lxr5" event={"ID":"c7a7261c-41bd-407b-b8a3-3a294abe82ed","Type":"ContainerDied","Data":"06d8601d316dc466eef35ba1bab363dd70be9dfee866960751ed2d470264de03"} Feb 25 17:01:06 crc 
kubenswrapper[4937]: I0225 17:01:06.135409 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.188402 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-fernet-keys\") pod \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.188862 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9wh2\" (UniqueName: \"kubernetes.io/projected/c7a7261c-41bd-407b-b8a3-3a294abe82ed-kube-api-access-j9wh2\") pod \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.189516 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-combined-ca-bundle\") pod \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.189810 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-config-data\") pod \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\" (UID: \"c7a7261c-41bd-407b-b8a3-3a294abe82ed\") " Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.194052 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c7a7261c-41bd-407b-b8a3-3a294abe82ed" (UID: "c7a7261c-41bd-407b-b8a3-3a294abe82ed"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.194339 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a7261c-41bd-407b-b8a3-3a294abe82ed-kube-api-access-j9wh2" (OuterVolumeSpecName: "kube-api-access-j9wh2") pod "c7a7261c-41bd-407b-b8a3-3a294abe82ed" (UID: "c7a7261c-41bd-407b-b8a3-3a294abe82ed"). InnerVolumeSpecName "kube-api-access-j9wh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.220377 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7a7261c-41bd-407b-b8a3-3a294abe82ed" (UID: "c7a7261c-41bd-407b-b8a3-3a294abe82ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.247676 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-config-data" (OuterVolumeSpecName: "config-data") pod "c7a7261c-41bd-407b-b8a3-3a294abe82ed" (UID: "c7a7261c-41bd-407b-b8a3-3a294abe82ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.291615 4937 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.291645 4937 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.291655 4937 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7a7261c-41bd-407b-b8a3-3a294abe82ed-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.291666 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9wh2\" (UniqueName: \"kubernetes.io/projected/c7a7261c-41bd-407b-b8a3-3a294abe82ed-kube-api-access-j9wh2\") on node \"crc\" DevicePath \"\"" Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.455271 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29533981-4lxr5" event={"ID":"c7a7261c-41bd-407b-b8a3-3a294abe82ed","Type":"ContainerDied","Data":"2108d60c28132c3dd8d76a518061a4245e02fbe73b9f17d59e9b992981855204"} Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.455313 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2108d60c28132c3dd8d76a518061a4245e02fbe73b9f17d59e9b992981855204" Feb 25 17:01:06 crc kubenswrapper[4937]: I0225 17:01:06.455331 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29533981-4lxr5" Feb 25 17:01:13 crc kubenswrapper[4937]: I0225 17:01:13.368538 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:01:13 crc kubenswrapper[4937]: E0225 17:01:13.369809 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:01:28 crc kubenswrapper[4937]: I0225 17:01:28.368072 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:01:28 crc kubenswrapper[4937]: E0225 17:01:28.369044 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:01:41 crc kubenswrapper[4937]: I0225 17:01:41.381733 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:01:41 crc kubenswrapper[4937]: E0225 17:01:41.383010 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:01:54 crc kubenswrapper[4937]: I0225 17:01:54.367418 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:01:54 crc kubenswrapper[4937]: E0225 17:01:54.368254 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:02:00 crc kubenswrapper[4937]: I0225 17:02:00.164507 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533982-gnmxb"] Feb 25 17:02:00 crc kubenswrapper[4937]: E0225 17:02:00.165644 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a7261c-41bd-407b-b8a3-3a294abe82ed" containerName="keystone-cron" Feb 25 17:02:00 crc kubenswrapper[4937]: I0225 17:02:00.165664 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a7261c-41bd-407b-b8a3-3a294abe82ed" containerName="keystone-cron" Feb 25 17:02:00 crc kubenswrapper[4937]: I0225 17:02:00.165911 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a7261c-41bd-407b-b8a3-3a294abe82ed" containerName="keystone-cron" Feb 25 17:02:00 crc kubenswrapper[4937]: I0225 17:02:00.166850 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533982-gnmxb" Feb 25 17:02:00 crc kubenswrapper[4937]: I0225 17:02:00.173222 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 17:02:00 crc kubenswrapper[4937]: I0225 17:02:00.173431 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 17:02:00 crc kubenswrapper[4937]: I0225 17:02:00.175234 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 17:02:00 crc kubenswrapper[4937]: I0225 17:02:00.179777 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533982-gnmxb"] Feb 25 17:02:00 crc kubenswrapper[4937]: I0225 17:02:00.241352 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9gr8\" (UniqueName: \"kubernetes.io/projected/679c06c3-573a-4c6a-8ad6-48e423529023-kube-api-access-t9gr8\") pod \"auto-csr-approver-29533982-gnmxb\" (UID: \"679c06c3-573a-4c6a-8ad6-48e423529023\") " pod="openshift-infra/auto-csr-approver-29533982-gnmxb" Feb 25 17:02:00 crc kubenswrapper[4937]: I0225 17:02:00.344426 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9gr8\" (UniqueName: \"kubernetes.io/projected/679c06c3-573a-4c6a-8ad6-48e423529023-kube-api-access-t9gr8\") pod \"auto-csr-approver-29533982-gnmxb\" (UID: \"679c06c3-573a-4c6a-8ad6-48e423529023\") " pod="openshift-infra/auto-csr-approver-29533982-gnmxb" Feb 25 17:02:00 crc kubenswrapper[4937]: I0225 17:02:00.374263 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9gr8\" (UniqueName: \"kubernetes.io/projected/679c06c3-573a-4c6a-8ad6-48e423529023-kube-api-access-t9gr8\") pod \"auto-csr-approver-29533982-gnmxb\" (UID: \"679c06c3-573a-4c6a-8ad6-48e423529023\") " pod="openshift-infra/auto-csr-approver-29533982-gnmxb" Feb 25 17:02:00 crc kubenswrapper[4937]: I0225 17:02:00.504051 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533982-gnmxb" Feb 25 17:02:01 crc kubenswrapper[4937]: W0225 17:02:01.009041 4937 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod679c06c3_573a_4c6a_8ad6_48e423529023.slice/crio-f50298d948158c4e1f153abbc685596260b652bc84bf4ae0e5d28805839c12b6 WatchSource:0}: Error finding container f50298d948158c4e1f153abbc685596260b652bc84bf4ae0e5d28805839c12b6: Status 404 returned error can't find the container with id f50298d948158c4e1f153abbc685596260b652bc84bf4ae0e5d28805839c12b6 Feb 25 17:02:01 crc kubenswrapper[4937]: I0225 17:02:01.011515 4937 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 17:02:01 crc kubenswrapper[4937]: I0225 17:02:01.015253 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533982-gnmxb"] Feb 25 17:02:01 crc kubenswrapper[4937]: I0225 17:02:01.057539 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533982-gnmxb" event={"ID":"679c06c3-573a-4c6a-8ad6-48e423529023","Type":"ContainerStarted","Data":"f50298d948158c4e1f153abbc685596260b652bc84bf4ae0e5d28805839c12b6"} Feb 25 17:02:03 crc kubenswrapper[4937]: I0225 17:02:03.081131 4937 generic.go:334] "Generic (PLEG): container finished" podID="679c06c3-573a-4c6a-8ad6-48e423529023" containerID="9c2599f159e4b9b6bc9e1e5ea1f7bbf4dd7775836ea9074537d07f925d856b1e" exitCode=0 Feb 25 17:02:03 crc kubenswrapper[4937]: I0225 17:02:03.081191 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533982-gnmxb" event={"ID":"679c06c3-573a-4c6a-8ad6-48e423529023","Type":"ContainerDied","Data":"9c2599f159e4b9b6bc9e1e5ea1f7bbf4dd7775836ea9074537d07f925d856b1e"} Feb 25 17:02:04 crc kubenswrapper[4937]: I0225 17:02:04.727334 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533982-gnmxb" Feb 25 17:02:04 crc kubenswrapper[4937]: I0225 17:02:04.851409 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9gr8\" (UniqueName: \"kubernetes.io/projected/679c06c3-573a-4c6a-8ad6-48e423529023-kube-api-access-t9gr8\") pod \"679c06c3-573a-4c6a-8ad6-48e423529023\" (UID: \"679c06c3-573a-4c6a-8ad6-48e423529023\") " Feb 25 17:02:04 crc kubenswrapper[4937]: I0225 17:02:04.858871 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/679c06c3-573a-4c6a-8ad6-48e423529023-kube-api-access-t9gr8" (OuterVolumeSpecName: "kube-api-access-t9gr8") pod "679c06c3-573a-4c6a-8ad6-48e423529023" (UID: "679c06c3-573a-4c6a-8ad6-48e423529023"). InnerVolumeSpecName "kube-api-access-t9gr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 17:02:04 crc kubenswrapper[4937]: I0225 17:02:04.954293 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9gr8\" (UniqueName: \"kubernetes.io/projected/679c06c3-573a-4c6a-8ad6-48e423529023-kube-api-access-t9gr8\") on node \"crc\" DevicePath \"\"" Feb 25 17:02:05 crc kubenswrapper[4937]: I0225 17:02:05.134905 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533982-gnmxb" event={"ID":"679c06c3-573a-4c6a-8ad6-48e423529023","Type":"ContainerDied","Data":"f50298d948158c4e1f153abbc685596260b652bc84bf4ae0e5d28805839c12b6"} Feb 25 17:02:05 crc kubenswrapper[4937]: I0225 17:02:05.134983 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533982-gnmxb" Feb 25 17:02:05 crc kubenswrapper[4937]: I0225 17:02:05.134999 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f50298d948158c4e1f153abbc685596260b652bc84bf4ae0e5d28805839c12b6" Feb 25 17:02:05 crc kubenswrapper[4937]: I0225 17:02:05.368206 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:02:05 crc kubenswrapper[4937]: E0225 17:02:05.368721 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:02:05 crc kubenswrapper[4937]: I0225 17:02:05.798218 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533976-9dkfb"] Feb 25 17:02:05 crc kubenswrapper[4937]: I0225 17:02:05.808635 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533976-9dkfb"] Feb 25 17:02:07 crc kubenswrapper[4937]: I0225 17:02:07.385804 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e723a3-7bd5-483a-8678-02a8df3a405b" path="/var/lib/kubelet/pods/50e723a3-7bd5-483a-8678-02a8df3a405b/volumes" Feb 25 17:02:18 crc kubenswrapper[4937]: I0225 17:02:18.270177 4937 generic.go:334] "Generic (PLEG): container finished" podID="db789b16-3221-4d7f-a3ac-10a2b3169ad5" containerID="9a3b00aaa1d1a76a1dded6ad57e9c8e0b2c2c4672bd73e374ee55fb5c9c643b0" exitCode=0 Feb 25 17:02:18 crc kubenswrapper[4937]: I0225 17:02:18.270276 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bs75f/must-gather-6bkg2" event={"ID":"db789b16-3221-4d7f-a3ac-10a2b3169ad5","Type":"ContainerDied","Data":"9a3b00aaa1d1a76a1dded6ad57e9c8e0b2c2c4672bd73e374ee55fb5c9c643b0"} Feb 25 17:02:18 crc kubenswrapper[4937]: I0225 17:02:18.271216 4937 scope.go:117] "RemoveContainer" containerID="9a3b00aaa1d1a76a1dded6ad57e9c8e0b2c2c4672bd73e374ee55fb5c9c643b0" Feb 25 17:02:18 crc kubenswrapper[4937]: I0225 17:02:18.368007 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:02:18 crc kubenswrapper[4937]: E0225 17:02:18.368243 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:02:19 crc kubenswrapper[4937]: I0225 17:02:19.179251 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bs75f_must-gather-6bkg2_db789b16-3221-4d7f-a3ac-10a2b3169ad5/gather/0.log" Feb 25 17:02:32 crc kubenswrapper[4937]: I0225 17:02:32.368022 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:02:32 crc kubenswrapper[4937]: E0225 17:02:32.368856 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:02:32 crc kubenswrapper[4937]: I0225 17:02:32.515706 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bs75f/must-gather-6bkg2"] Feb 25 17:02:32 crc kubenswrapper[4937]: I0225 17:02:32.516601 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-bs75f/must-gather-6bkg2" podUID="db789b16-3221-4d7f-a3ac-10a2b3169ad5" containerName="copy" containerID="cri-o://ed4c3ada1d83a22ab98d9fb4490da1984b3d1421a1537db9aa87408f6d218b5f" gracePeriod=2 Feb 25 17:02:32 crc kubenswrapper[4937]: I0225 17:02:32.543740 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bs75f/must-gather-6bkg2"] Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.189540 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bs75f_must-gather-6bkg2_db789b16-3221-4d7f-a3ac-10a2b3169ad5/copy/0.log" Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.190201 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bs75f/must-gather-6bkg2" Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.255808 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db789b16-3221-4d7f-a3ac-10a2b3169ad5-must-gather-output\") pod \"db789b16-3221-4d7f-a3ac-10a2b3169ad5\" (UID: \"db789b16-3221-4d7f-a3ac-10a2b3169ad5\") " Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.255881 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tplz7\" (UniqueName: \"kubernetes.io/projected/db789b16-3221-4d7f-a3ac-10a2b3169ad5-kube-api-access-tplz7\") pod \"db789b16-3221-4d7f-a3ac-10a2b3169ad5\" (UID: \"db789b16-3221-4d7f-a3ac-10a2b3169ad5\") " Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.263682 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db789b16-3221-4d7f-a3ac-10a2b3169ad5-kube-api-access-tplz7" (OuterVolumeSpecName: "kube-api-access-tplz7") pod "db789b16-3221-4d7f-a3ac-10a2b3169ad5" (UID: "db789b16-3221-4d7f-a3ac-10a2b3169ad5"). InnerVolumeSpecName "kube-api-access-tplz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.358499 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tplz7\" (UniqueName: \"kubernetes.io/projected/db789b16-3221-4d7f-a3ac-10a2b3169ad5-kube-api-access-tplz7\") on node \"crc\" DevicePath \"\"" Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.432067 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db789b16-3221-4d7f-a3ac-10a2b3169ad5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "db789b16-3221-4d7f-a3ac-10a2b3169ad5" (UID: "db789b16-3221-4d7f-a3ac-10a2b3169ad5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.443438 4937 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bs75f_must-gather-6bkg2_db789b16-3221-4d7f-a3ac-10a2b3169ad5/copy/0.log" Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.443763 4937 generic.go:334] "Generic (PLEG): container finished" podID="db789b16-3221-4d7f-a3ac-10a2b3169ad5" containerID="ed4c3ada1d83a22ab98d9fb4490da1984b3d1421a1537db9aa87408f6d218b5f" exitCode=143 Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.443813 4937 scope.go:117] "RemoveContainer" containerID="ed4c3ada1d83a22ab98d9fb4490da1984b3d1421a1537db9aa87408f6d218b5f" Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.443932 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bs75f/must-gather-6bkg2" Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.460373 4937 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db789b16-3221-4d7f-a3ac-10a2b3169ad5-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.483775 4937 scope.go:117] "RemoveContainer" containerID="9a3b00aaa1d1a76a1dded6ad57e9c8e0b2c2c4672bd73e374ee55fb5c9c643b0" Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.521478 4937 scope.go:117] "RemoveContainer" containerID="ed4c3ada1d83a22ab98d9fb4490da1984b3d1421a1537db9aa87408f6d218b5f" Feb 25 17:02:33 crc kubenswrapper[4937]: E0225 17:02:33.521996 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed4c3ada1d83a22ab98d9fb4490da1984b3d1421a1537db9aa87408f6d218b5f\": container with ID starting with ed4c3ada1d83a22ab98d9fb4490da1984b3d1421a1537db9aa87408f6d218b5f not found: ID does not exist" containerID="ed4c3ada1d83a22ab98d9fb4490da1984b3d1421a1537db9aa87408f6d218b5f" Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.522033 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed4c3ada1d83a22ab98d9fb4490da1984b3d1421a1537db9aa87408f6d218b5f"} err="failed to get container status \"ed4c3ada1d83a22ab98d9fb4490da1984b3d1421a1537db9aa87408f6d218b5f\": rpc error: code = NotFound desc = could not find container \"ed4c3ada1d83a22ab98d9fb4490da1984b3d1421a1537db9aa87408f6d218b5f\": container with ID starting with ed4c3ada1d83a22ab98d9fb4490da1984b3d1421a1537db9aa87408f6d218b5f not found: ID does not exist" Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.522060 4937 scope.go:117] "RemoveContainer" containerID="9a3b00aaa1d1a76a1dded6ad57e9c8e0b2c2c4672bd73e374ee55fb5c9c643b0" Feb 25 17:02:33 crc 
kubenswrapper[4937]: E0225 17:02:33.522514 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a3b00aaa1d1a76a1dded6ad57e9c8e0b2c2c4672bd73e374ee55fb5c9c643b0\": container with ID starting with 9a3b00aaa1d1a76a1dded6ad57e9c8e0b2c2c4672bd73e374ee55fb5c9c643b0 not found: ID does not exist" containerID="9a3b00aaa1d1a76a1dded6ad57e9c8e0b2c2c4672bd73e374ee55fb5c9c643b0" Feb 25 17:02:33 crc kubenswrapper[4937]: I0225 17:02:33.522540 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3b00aaa1d1a76a1dded6ad57e9c8e0b2c2c4672bd73e374ee55fb5c9c643b0"} err="failed to get container status \"9a3b00aaa1d1a76a1dded6ad57e9c8e0b2c2c4672bd73e374ee55fb5c9c643b0\": rpc error: code = NotFound desc = could not find container \"9a3b00aaa1d1a76a1dded6ad57e9c8e0b2c2c4672bd73e374ee55fb5c9c643b0\": container with ID starting with 9a3b00aaa1d1a76a1dded6ad57e9c8e0b2c2c4672bd73e374ee55fb5c9c643b0 not found: ID does not exist" Feb 25 17:02:35 crc kubenswrapper[4937]: I0225 17:02:35.379920 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db789b16-3221-4d7f-a3ac-10a2b3169ad5" path="/var/lib/kubelet/pods/db789b16-3221-4d7f-a3ac-10a2b3169ad5/volumes" Feb 25 17:02:44 crc kubenswrapper[4937]: I0225 17:02:44.658496 4937 scope.go:117] "RemoveContainer" containerID="0bf340e507d0b47f56a4b858fc4de408aacf6bd50c6d07c793853b8ca29be45e" Feb 25 17:02:45 crc kubenswrapper[4937]: I0225 17:02:45.369431 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:02:45 crc kubenswrapper[4937]: E0225 17:02:45.369754 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:02:59 crc kubenswrapper[4937]: I0225 17:02:59.367821 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:02:59 crc kubenswrapper[4937]: E0225 17:02:59.368630 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:03:11 crc kubenswrapper[4937]: I0225 17:03:11.380662 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:03:11 crc kubenswrapper[4937]: E0225 17:03:11.381723 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:03:24 crc kubenswrapper[4937]: I0225 17:03:24.367394 4937 
scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:03:24 crc kubenswrapper[4937]: E0225 17:03:24.368090 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:03:35 crc kubenswrapper[4937]: I0225 17:03:35.367804 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:03:35 crc kubenswrapper[4937]: E0225 17:03:35.368539 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:03:49 crc kubenswrapper[4937]: I0225 17:03:49.372314 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:03:49 crc kubenswrapper[4937]: E0225 17:03:49.373092 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.172009 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533984-lsksg"] Feb 25 17:04:00 crc kubenswrapper[4937]: E0225 17:04:00.173954 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db789b16-3221-4d7f-a3ac-10a2b3169ad5" containerName="copy" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.173989 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="db789b16-3221-4d7f-a3ac-10a2b3169ad5" containerName="copy" Feb 25 17:04:00 crc kubenswrapper[4937]: E0225 17:04:00.174038 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db789b16-3221-4d7f-a3ac-10a2b3169ad5" containerName="gather" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.174055 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="db789b16-3221-4d7f-a3ac-10a2b3169ad5" containerName="gather" Feb 25 17:04:00 crc kubenswrapper[4937]: E0225 17:04:00.174078 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="679c06c3-573a-4c6a-8ad6-48e423529023" containerName="oc" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.174095 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="679c06c3-573a-4c6a-8ad6-48e423529023" containerName="oc" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.174662 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="db789b16-3221-4d7f-a3ac-10a2b3169ad5" containerName="copy" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.174699 4937 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="679c06c3-573a-4c6a-8ad6-48e423529023" containerName="oc" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.174743 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="db789b16-3221-4d7f-a3ac-10a2b3169ad5" containerName="gather" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.176531 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533984-lsksg" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.180196 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.180730 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.181357 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.189852 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533984-lsksg"] Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.329818 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h9hf\" (UniqueName: \"kubernetes.io/projected/4a24e603-6b50-4ef4-b07e-f184fafcfd77-kube-api-access-5h9hf\") pod \"auto-csr-approver-29533984-lsksg\" (UID: \"4a24e603-6b50-4ef4-b07e-f184fafcfd77\") " pod="openshift-infra/auto-csr-approver-29533984-lsksg" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.432439 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h9hf\" (UniqueName: \"kubernetes.io/projected/4a24e603-6b50-4ef4-b07e-f184fafcfd77-kube-api-access-5h9hf\") pod \"auto-csr-approver-29533984-lsksg\" (UID: \"4a24e603-6b50-4ef4-b07e-f184fafcfd77\") " pod="openshift-infra/auto-csr-approver-29533984-lsksg" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.453217 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h9hf\" (UniqueName: \"kubernetes.io/projected/4a24e603-6b50-4ef4-b07e-f184fafcfd77-kube-api-access-5h9hf\") pod \"auto-csr-approver-29533984-lsksg\" (UID: \"4a24e603-6b50-4ef4-b07e-f184fafcfd77\") " pod="openshift-infra/auto-csr-approver-29533984-lsksg" Feb 25 17:04:00 crc kubenswrapper[4937]: I0225 17:04:00.521062 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533984-lsksg" Feb 25 17:04:01 crc kubenswrapper[4937]: I0225 17:04:01.041279 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533984-lsksg"] Feb 25 17:04:01 crc kubenswrapper[4937]: I0225 17:04:01.346029 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533984-lsksg" event={"ID":"4a24e603-6b50-4ef4-b07e-f184fafcfd77","Type":"ContainerStarted","Data":"4f35d6f177ccfca7c76eb64d06a8b5f316197fbb6190dc268df27582fdea73b8"} Feb 25 17:04:03 crc kubenswrapper[4937]: I0225 17:04:03.390851 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533984-lsksg" event={"ID":"4a24e603-6b50-4ef4-b07e-f184fafcfd77","Type":"ContainerStarted","Data":"b234c62d34f80dfee29f25a042001c8d6d69fccc04bbbce990a1cf2725afe569"} Feb 25 17:04:03 crc kubenswrapper[4937]: I0225 17:04:03.401459 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533984-lsksg" podStartSLOduration=1.457822172 podStartE2EDuration="3.401432544s" podCreationTimestamp="2026-02-25 17:04:00 +0000 UTC" firstStartedPulling="2026-02-25 17:04:01.043408183 +0000 UTC m=+4692.056800063" lastFinishedPulling="2026-02-25 17:04:02.987018525 +0000 UTC m=+4694.000410435" observedRunningTime="2026-02-25 17:04:03.397706491 +0000 UTC m=+4694.411098381" watchObservedRunningTime="2026-02-25 17:04:03.401432544 +0000 UTC m=+4694.414824474" Feb 25 17:04:04 crc kubenswrapper[4937]: I0225 17:04:04.368082 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:04:04 crc kubenswrapper[4937]: E0225 17:04:04.368851 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:04:04 crc kubenswrapper[4937]: I0225 17:04:04.385552 4937 generic.go:334] "Generic (PLEG): container finished" podID="4a24e603-6b50-4ef4-b07e-f184fafcfd77" containerID="b234c62d34f80dfee29f25a042001c8d6d69fccc04bbbce990a1cf2725afe569" exitCode=0 Feb 25 17:04:04 crc kubenswrapper[4937]: I0225 17:04:04.385615 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533984-lsksg" event={"ID":"4a24e603-6b50-4ef4-b07e-f184fafcfd77","Type":"ContainerDied","Data":"b234c62d34f80dfee29f25a042001c8d6d69fccc04bbbce990a1cf2725afe569"} Feb 25 17:04:06 crc kubenswrapper[4937]: I0225 17:04:06.328797 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533984-lsksg" Feb 25 17:04:06 crc kubenswrapper[4937]: I0225 17:04:06.404780 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533984-lsksg" event={"ID":"4a24e603-6b50-4ef4-b07e-f184fafcfd77","Type":"ContainerDied","Data":"4f35d6f177ccfca7c76eb64d06a8b5f316197fbb6190dc268df27582fdea73b8"} Feb 25 17:04:06 crc kubenswrapper[4937]: I0225 17:04:06.404815 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f35d6f177ccfca7c76eb64d06a8b5f316197fbb6190dc268df27582fdea73b8" Feb 25 17:04:06 crc kubenswrapper[4937]: I0225 17:04:06.404841 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533984-lsksg" Feb 25 17:04:06 crc kubenswrapper[4937]: I0225 17:04:06.441262 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h9hf\" (UniqueName: \"kubernetes.io/projected/4a24e603-6b50-4ef4-b07e-f184fafcfd77-kube-api-access-5h9hf\") pod \"4a24e603-6b50-4ef4-b07e-f184fafcfd77\" (UID: \"4a24e603-6b50-4ef4-b07e-f184fafcfd77\") " Feb 25 17:04:06 crc kubenswrapper[4937]: I0225 17:04:06.451935 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a24e603-6b50-4ef4-b07e-f184fafcfd77-kube-api-access-5h9hf" (OuterVolumeSpecName: "kube-api-access-5h9hf") pod "4a24e603-6b50-4ef4-b07e-f184fafcfd77" (UID: "4a24e603-6b50-4ef4-b07e-f184fafcfd77"). InnerVolumeSpecName "kube-api-access-5h9hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 17:04:06 crc kubenswrapper[4937]: I0225 17:04:06.473625 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533978-7vkz5"] Feb 25 17:04:06 crc kubenswrapper[4937]: I0225 17:04:06.485588 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533978-7vkz5"] Feb 25 17:04:06 crc kubenswrapper[4937]: I0225 17:04:06.546207 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h9hf\" (UniqueName: \"kubernetes.io/projected/4a24e603-6b50-4ef4-b07e-f184fafcfd77-kube-api-access-5h9hf\") on node \"crc\" DevicePath \"\"" Feb 25 17:04:07 crc kubenswrapper[4937]: I0225 17:04:07.379867 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37aa378c-2e15-4213-9554-d54aab5803da" path="/var/lib/kubelet/pods/37aa378c-2e15-4213-9554-d54aab5803da/volumes" Feb 25 17:04:17 crc kubenswrapper[4937]: I0225 17:04:17.367996 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:04:17 crc kubenswrapper[4937]: E0225 17:04:17.368880 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:04:31 crc kubenswrapper[4937]: I0225 17:04:31.374590 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:04:31 crc kubenswrapper[4937]: E0225 17:04:31.375411 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:04:44 crc kubenswrapper[4937]: I0225 17:04:44.367626 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:04:44 crc kubenswrapper[4937]: E0225 17:04:44.368418 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:04:44 crc kubenswrapper[4937]: I0225 17:04:44.774990 4937 scope.go:117] "RemoveContainer" containerID="5e8b62139f79705c5364ccedc10bddd3925ed9baab4662f7ce5641fd2a2be0fe" Feb 25 17:04:57 crc kubenswrapper[4937]: I0225 17:04:57.367730 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:04:57 crc kubenswrapper[4937]: E0225 17:04:57.370427 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:05:09 crc kubenswrapper[4937]: I0225 17:05:09.368044 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:05:09 crc kubenswrapper[4937]: E0225 17:05:09.368824 4937 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2r4xd_openshift-machine-config-operator(8f826096-fb93-42fe-a779-9afe1d36f2d4)\"" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" podUID="8f826096-fb93-42fe-a779-9afe1d36f2d4" Feb 25 17:05:22 crc kubenswrapper[4937]: I0225 17:05:22.367299 4937 scope.go:117] "RemoveContainer" containerID="9e26c4d045988f51adbf1d4cb20c5e50951b5993f6f35fb75a1d3d689dfe3c30" Feb 25 17:05:24 crc kubenswrapper[4937]: I0225 17:05:24.226898 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2r4xd" event={"ID":"8f826096-fb93-42fe-a779-9afe1d36f2d4","Type":"ContainerStarted","Data":"1efa5cc254fc48678ef3535197ec81c924c3dcebd8628c178e2716a6e6812704"} Feb 25 17:06:00 crc kubenswrapper[4937]: I0225 17:06:00.169735 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533986-7gq5l"] Feb 25 17:06:00 crc kubenswrapper[4937]: E0225 17:06:00.170853 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a24e603-6b50-4ef4-b07e-f184fafcfd77" containerName="oc" Feb 25 17:06:00 crc kubenswrapper[4937]: I0225 17:06:00.170871 4937 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4a24e603-6b50-4ef4-b07e-f184fafcfd77" containerName="oc" Feb 25 17:06:00 crc kubenswrapper[4937]: I0225 17:06:00.171164 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a24e603-6b50-4ef4-b07e-f184fafcfd77" containerName="oc" Feb 25 17:06:00 crc kubenswrapper[4937]: I0225 17:06:00.172064 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533986-7gq5l" Feb 25 17:06:00 crc kubenswrapper[4937]: I0225 17:06:00.176796 4937 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9xs7m" Feb 25 17:06:00 crc kubenswrapper[4937]: I0225 17:06:00.176913 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 17:06:00 crc kubenswrapper[4937]: I0225 17:06:00.179789 4937 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 17:06:00 crc kubenswrapper[4937]: I0225 17:06:00.190456 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533986-7gq5l"] Feb 25 17:06:00 crc kubenswrapper[4937]: I0225 17:06:00.228556 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw68t\" (UniqueName: \"kubernetes.io/projected/27e6e49a-c998-4506-afeb-6273e12603ef-kube-api-access-rw68t\") pod \"auto-csr-approver-29533986-7gq5l\" (UID: \"27e6e49a-c998-4506-afeb-6273e12603ef\") " pod="openshift-infra/auto-csr-approver-29533986-7gq5l" Feb 25 17:06:00 crc kubenswrapper[4937]: I0225 17:06:00.330544 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw68t\" (UniqueName: \"kubernetes.io/projected/27e6e49a-c998-4506-afeb-6273e12603ef-kube-api-access-rw68t\") pod \"auto-csr-approver-29533986-7gq5l\" (UID: \"27e6e49a-c998-4506-afeb-6273e12603ef\") " pod="openshift-infra/auto-csr-approver-29533986-7gq5l" Feb 25 17:06:00 crc kubenswrapper[4937]: I0225 17:06:00.353739 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw68t\" (UniqueName: \"kubernetes.io/projected/27e6e49a-c998-4506-afeb-6273e12603ef-kube-api-access-rw68t\") pod \"auto-csr-approver-29533986-7gq5l\" (UID: \"27e6e49a-c998-4506-afeb-6273e12603ef\") " pod="openshift-infra/auto-csr-approver-29533986-7gq5l" Feb 25 17:06:00 crc kubenswrapper[4937]: I0225 17:06:00.491909 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533986-7gq5l" Feb 25 17:06:00 crc kubenswrapper[4937]: I0225 17:06:00.953581 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533986-7gq5l"] Feb 25 17:06:01 crc kubenswrapper[4937]: I0225 17:06:01.657516 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533986-7gq5l" event={"ID":"27e6e49a-c998-4506-afeb-6273e12603ef","Type":"ContainerStarted","Data":"af4f111f1d015fb4cd1d94f034ed30ae8b17537d494a501c003dc5c5958888c3"} Feb 25 17:06:03 crc kubenswrapper[4937]: I0225 17:06:03.726837 4937 generic.go:334] "Generic (PLEG): container finished" podID="27e6e49a-c998-4506-afeb-6273e12603ef" containerID="96453f4641f4b5638151b12b535b10769fc6a1bfe42f2073168f41279e600f07" exitCode=0 Feb 25 17:06:03 crc kubenswrapper[4937]: I0225 17:06:03.727379 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533986-7gq5l" event={"ID":"27e6e49a-c998-4506-afeb-6273e12603ef","Type":"ContainerDied","Data":"96453f4641f4b5638151b12b535b10769fc6a1bfe42f2073168f41279e600f07"} Feb 25 17:06:05 crc kubenswrapper[4937]: I0225 17:06:05.380874 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533986-7gq5l" Feb 25 17:06:05 crc kubenswrapper[4937]: I0225 17:06:05.499003 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw68t\" (UniqueName: \"kubernetes.io/projected/27e6e49a-c998-4506-afeb-6273e12603ef-kube-api-access-rw68t\") pod \"27e6e49a-c998-4506-afeb-6273e12603ef\" (UID: \"27e6e49a-c998-4506-afeb-6273e12603ef\") " Feb 25 17:06:05 crc kubenswrapper[4937]: I0225 17:06:05.509147 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e6e49a-c998-4506-afeb-6273e12603ef-kube-api-access-rw68t" (OuterVolumeSpecName: "kube-api-access-rw68t") pod "27e6e49a-c998-4506-afeb-6273e12603ef" (UID: "27e6e49a-c998-4506-afeb-6273e12603ef"). InnerVolumeSpecName "kube-api-access-rw68t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 17:06:05 crc kubenswrapper[4937]: I0225 17:06:05.604301 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw68t\" (UniqueName: \"kubernetes.io/projected/27e6e49a-c998-4506-afeb-6273e12603ef-kube-api-access-rw68t\") on node \"crc\" DevicePath \"\"" Feb 25 17:06:05 crc kubenswrapper[4937]: I0225 17:06:05.752650 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533986-7gq5l" event={"ID":"27e6e49a-c998-4506-afeb-6273e12603ef","Type":"ContainerDied","Data":"af4f111f1d015fb4cd1d94f034ed30ae8b17537d494a501c003dc5c5958888c3"} Feb 25 17:06:05 crc kubenswrapper[4937]: I0225 17:06:05.752701 4937 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af4f111f1d015fb4cd1d94f034ed30ae8b17537d494a501c003dc5c5958888c3" Feb 25 17:06:05 crc kubenswrapper[4937]: I0225 17:06:05.752771 4937 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533986-7gq5l" Feb 25 17:06:06 crc kubenswrapper[4937]: I0225 17:06:06.507431 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533980-4p6s6"] Feb 25 17:06:06 crc kubenswrapper[4937]: I0225 17:06:06.520146 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533980-4p6s6"] Feb 25 17:06:07 crc kubenswrapper[4937]: I0225 17:06:07.396252 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e87edea4-e15e-4b00-b34e-5097d7206b4e" path="/var/lib/kubelet/pods/e87edea4-e15e-4b00-b34e-5097d7206b4e/volumes" Feb 25 17:06:09 crc kubenswrapper[4937]: I0225 17:06:09.818020 4937 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s7x5t"] Feb 25 17:06:09 crc kubenswrapper[4937]: E0225 17:06:09.819021 4937 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e6e49a-c998-4506-afeb-6273e12603ef" containerName="oc" Feb 25 17:06:09 crc kubenswrapper[4937]: I0225 17:06:09.819041 4937 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e6e49a-c998-4506-afeb-6273e12603ef" containerName="oc" Feb 25 17:06:09 crc kubenswrapper[4937]: I0225 17:06:09.819414 4937 memory_manager.go:354] "RemoveStaleState removing state" podUID="27e6e49a-c998-4506-afeb-6273e12603ef" containerName="oc" Feb 25 17:06:09 crc kubenswrapper[4937]: I0225 17:06:09.822684 4937 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:09 crc kubenswrapper[4937]: I0225 17:06:09.836876 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s7x5t"] Feb 25 17:06:09 crc kubenswrapper[4937]: I0225 17:06:09.905571 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223cbd97-6ef8-4734-9e2a-23140182ee36-utilities\") pod \"redhat-operators-s7x5t\" (UID: \"223cbd97-6ef8-4734-9e2a-23140182ee36\") " pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:09 crc kubenswrapper[4937]: I0225 17:06:09.905661 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223cbd97-6ef8-4734-9e2a-23140182ee36-catalog-content\") pod \"redhat-operators-s7x5t\" (UID: \"223cbd97-6ef8-4734-9e2a-23140182ee36\") " pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:09 crc kubenswrapper[4937]: I0225 17:06:09.906452 4937 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q82gz\" (UniqueName: \"kubernetes.io/projected/223cbd97-6ef8-4734-9e2a-23140182ee36-kube-api-access-q82gz\") pod \"redhat-operators-s7x5t\" (UID: \"223cbd97-6ef8-4734-9e2a-23140182ee36\") " pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:10 crc kubenswrapper[4937]: I0225 17:06:10.009073 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q82gz\" (UniqueName: \"kubernetes.io/projected/223cbd97-6ef8-4734-9e2a-23140182ee36-kube-api-access-q82gz\") pod \"redhat-operators-s7x5t\" (UID: \"223cbd97-6ef8-4734-9e2a-23140182ee36\") " pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:10 crc kubenswrapper[4937]: I0225 17:06:10.009194 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223cbd97-6ef8-4734-9e2a-23140182ee36-utilities\") pod \"redhat-operators-s7x5t\" (UID: \"223cbd97-6ef8-4734-9e2a-23140182ee36\") " pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:10 crc kubenswrapper[4937]: I0225 17:06:10.009223 4937 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223cbd97-6ef8-4734-9e2a-23140182ee36-catalog-content\") pod \"redhat-operators-s7x5t\" (UID: \"223cbd97-6ef8-4734-9e2a-23140182ee36\") " pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:10 crc kubenswrapper[4937]: I0225 17:06:10.009784 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223cbd97-6ef8-4734-9e2a-23140182ee36-catalog-content\") pod \"redhat-operators-s7x5t\" (UID: \"223cbd97-6ef8-4734-9e2a-23140182ee36\") " pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:10 crc kubenswrapper[4937]: I0225 17:06:10.010336 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223cbd97-6ef8-4734-9e2a-23140182ee36-utilities\") pod \"redhat-operators-s7x5t\" (UID: \"223cbd97-6ef8-4734-9e2a-23140182ee36\") " pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:10 crc kubenswrapper[4937]: I0225 17:06:10.027724 4937 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q82gz\" (UniqueName: \"kubernetes.io/projected/223cbd97-6ef8-4734-9e2a-23140182ee36-kube-api-access-q82gz\") pod \"redhat-operators-s7x5t\" (UID: \"223cbd97-6ef8-4734-9e2a-23140182ee36\") " pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:10 crc kubenswrapper[4937]: I0225 17:06:10.155864 4937 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:11 crc kubenswrapper[4937]: I0225 17:06:11.146036 4937 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s7x5t"] Feb 25 17:06:11 crc kubenswrapper[4937]: I0225 17:06:11.823875 4937 generic.go:334] "Generic (PLEG): container finished" podID="223cbd97-6ef8-4734-9e2a-23140182ee36" containerID="5a1fea99170d6a132ae1bef3d6dbff17928f7d16c208e15d7a09d0d89977be81" exitCode=0 Feb 25 17:06:11 crc kubenswrapper[4937]: I0225 17:06:11.823943 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7x5t" event={"ID":"223cbd97-6ef8-4734-9e2a-23140182ee36","Type":"ContainerDied","Data":"5a1fea99170d6a132ae1bef3d6dbff17928f7d16c208e15d7a09d0d89977be81"} Feb 25 17:06:11 crc kubenswrapper[4937]: I0225 17:06:11.824417 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7x5t" event={"ID":"223cbd97-6ef8-4734-9e2a-23140182ee36","Type":"ContainerStarted","Data":"3e1e0a8c5d25deb51cc82dbeddbd6caa1a6d024cb69a79e57ad558dee36b23ed"} Feb 25 17:06:12 crc kubenswrapper[4937]: I0225 17:06:12.834692 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7x5t" event={"ID":"223cbd97-6ef8-4734-9e2a-23140182ee36","Type":"ContainerStarted","Data":"1fb76d31e3d4f4cb5a7bb2046b18416c2d9fcfd5f3c2660da7841ede29124274"} Feb 25 17:06:16 crc kubenswrapper[4937]: I0225 17:06:16.889668 4937 generic.go:334] "Generic (PLEG): container finished" podID="223cbd97-6ef8-4734-9e2a-23140182ee36" containerID="1fb76d31e3d4f4cb5a7bb2046b18416c2d9fcfd5f3c2660da7841ede29124274" exitCode=0 Feb 25 17:06:16 crc kubenswrapper[4937]: I0225 17:06:16.889790 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7x5t" event={"ID":"223cbd97-6ef8-4734-9e2a-23140182ee36","Type":"ContainerDied","Data":"1fb76d31e3d4f4cb5a7bb2046b18416c2d9fcfd5f3c2660da7841ede29124274"} Feb 25 17:06:17 crc kubenswrapper[4937]: I0225 17:06:17.904951 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7x5t" event={"ID":"223cbd97-6ef8-4734-9e2a-23140182ee36","Type":"ContainerStarted","Data":"adbd785a9648c5e160488ccd2ec8d8aecc64e359be3403efcccfe1ca78858ce6"} Feb 25 17:06:17 crc kubenswrapper[4937]: I0225 17:06:17.942954 4937 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s7x5t" podStartSLOduration=3.155729762 podStartE2EDuration="8.942934396s" podCreationTimestamp="2026-02-25 17:06:09 +0000 UTC" firstStartedPulling="2026-02-25 17:06:11.826215205 +0000 UTC m=+4822.839607095" lastFinishedPulling="2026-02-25 17:06:17.613419839 +0000 UTC m=+4828.626811729" observedRunningTime="2026-02-25 17:06:17.933400537 +0000 UTC m=+4828.946792427" watchObservedRunningTime="2026-02-25 17:06:17.942934396 +0000 UTC m=+4828.956326276" Feb 25 17:06:20 crc kubenswrapper[4937]: I0225 17:06:20.157013 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:20 crc kubenswrapper[4937]: I0225 17:06:20.159714 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:21 crc kubenswrapper[4937]: I0225 17:06:21.224559 4937 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s7x5t" 
podUID="223cbd97-6ef8-4734-9e2a-23140182ee36" containerName="registry-server" probeResult="failure" output=< Feb 25 17:06:21 crc kubenswrapper[4937]: timeout: failed to connect service ":50051" within 1s Feb 25 17:06:21 crc kubenswrapper[4937]: > Feb 25 17:06:30 crc kubenswrapper[4937]: I0225 17:06:30.223970 4937 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:30 crc kubenswrapper[4937]: I0225 17:06:30.277281 4937 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:30 crc kubenswrapper[4937]: I0225 17:06:30.982130 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s7x5t"] Feb 25 17:06:32 crc kubenswrapper[4937]: I0225 17:06:32.049753 4937 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s7x5t" podUID="223cbd97-6ef8-4734-9e2a-23140182ee36" containerName="registry-server" containerID="cri-o://adbd785a9648c5e160488ccd2ec8d8aecc64e359be3403efcccfe1ca78858ce6" gracePeriod=2 Feb 25 17:06:32 crc kubenswrapper[4937]: I0225 17:06:32.819867 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:32 crc kubenswrapper[4937]: I0225 17:06:32.929537 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223cbd97-6ef8-4734-9e2a-23140182ee36-catalog-content\") pod \"223cbd97-6ef8-4734-9e2a-23140182ee36\" (UID: \"223cbd97-6ef8-4734-9e2a-23140182ee36\") " Feb 25 17:06:32 crc kubenswrapper[4937]: I0225 17:06:32.929688 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223cbd97-6ef8-4734-9e2a-23140182ee36-utilities\") pod \"223cbd97-6ef8-4734-9e2a-23140182ee36\" (UID: \"223cbd97-6ef8-4734-9e2a-23140182ee36\") " Feb 25 17:06:32 crc kubenswrapper[4937]: I0225 17:06:32.929812 4937 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q82gz\" (UniqueName: \"kubernetes.io/projected/223cbd97-6ef8-4734-9e2a-23140182ee36-kube-api-access-q82gz\") pod \"223cbd97-6ef8-4734-9e2a-23140182ee36\" (UID: \"223cbd97-6ef8-4734-9e2a-23140182ee36\") " Feb 25 17:06:32 crc kubenswrapper[4937]: I0225 17:06:32.930540 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/223cbd97-6ef8-4734-9e2a-23140182ee36-utilities" (OuterVolumeSpecName: "utilities") pod "223cbd97-6ef8-4734-9e2a-23140182ee36" (UID: "223cbd97-6ef8-4734-9e2a-23140182ee36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 17:06:32 crc kubenswrapper[4937]: I0225 17:06:32.936748 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/223cbd97-6ef8-4734-9e2a-23140182ee36-kube-api-access-q82gz" (OuterVolumeSpecName: "kube-api-access-q82gz") pod "223cbd97-6ef8-4734-9e2a-23140182ee36" (UID: "223cbd97-6ef8-4734-9e2a-23140182ee36"). InnerVolumeSpecName "kube-api-access-q82gz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.033150 4937 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223cbd97-6ef8-4734-9e2a-23140182ee36-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.033221 4937 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q82gz\" (UniqueName: \"kubernetes.io/projected/223cbd97-6ef8-4734-9e2a-23140182ee36-kube-api-access-q82gz\") on node \"crc\" DevicePath \"\"" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.065319 4937 generic.go:334] "Generic (PLEG): container finished" podID="223cbd97-6ef8-4734-9e2a-23140182ee36" containerID="adbd785a9648c5e160488ccd2ec8d8aecc64e359be3403efcccfe1ca78858ce6" exitCode=0 Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.065400 4937 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s7x5t" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.065402 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7x5t" event={"ID":"223cbd97-6ef8-4734-9e2a-23140182ee36","Type":"ContainerDied","Data":"adbd785a9648c5e160488ccd2ec8d8aecc64e359be3403efcccfe1ca78858ce6"} Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.065568 4937 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7x5t" event={"ID":"223cbd97-6ef8-4734-9e2a-23140182ee36","Type":"ContainerDied","Data":"3e1e0a8c5d25deb51cc82dbeddbd6caa1a6d024cb69a79e57ad558dee36b23ed"} Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.065616 4937 scope.go:117] "RemoveContainer" containerID="adbd785a9648c5e160488ccd2ec8d8aecc64e359be3403efcccfe1ca78858ce6" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.073337 4937 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/223cbd97-6ef8-4734-9e2a-23140182ee36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "223cbd97-6ef8-4734-9e2a-23140182ee36" (UID: "223cbd97-6ef8-4734-9e2a-23140182ee36"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.094514 4937 scope.go:117] "RemoveContainer" containerID="1fb76d31e3d4f4cb5a7bb2046b18416c2d9fcfd5f3c2660da7841ede29124274" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.132138 4937 scope.go:117] "RemoveContainer" containerID="5a1fea99170d6a132ae1bef3d6dbff17928f7d16c208e15d7a09d0d89977be81" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.135156 4937 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223cbd97-6ef8-4734-9e2a-23140182ee36-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.221324 4937 scope.go:117] "RemoveContainer" containerID="adbd785a9648c5e160488ccd2ec8d8aecc64e359be3403efcccfe1ca78858ce6" Feb 25 17:06:33 crc kubenswrapper[4937]: E0225 17:06:33.223034 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adbd785a9648c5e160488ccd2ec8d8aecc64e359be3403efcccfe1ca78858ce6\": container with ID starting with adbd785a9648c5e160488ccd2ec8d8aecc64e359be3403efcccfe1ca78858ce6 not found: ID does not exist" containerID="adbd785a9648c5e160488ccd2ec8d8aecc64e359be3403efcccfe1ca78858ce6" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.223092 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adbd785a9648c5e160488ccd2ec8d8aecc64e359be3403efcccfe1ca78858ce6"} err="failed to get container status \"adbd785a9648c5e160488ccd2ec8d8aecc64e359be3403efcccfe1ca78858ce6\": rpc error: code = NotFound desc = could not find container \"adbd785a9648c5e160488ccd2ec8d8aecc64e359be3403efcccfe1ca78858ce6\": container with ID starting with adbd785a9648c5e160488ccd2ec8d8aecc64e359be3403efcccfe1ca78858ce6 not found: ID does not exist" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.223121 4937 scope.go:117] "RemoveContainer" containerID="1fb76d31e3d4f4cb5a7bb2046b18416c2d9fcfd5f3c2660da7841ede29124274" Feb 25 17:06:33 crc kubenswrapper[4937]: E0225 17:06:33.224130 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fb76d31e3d4f4cb5a7bb2046b18416c2d9fcfd5f3c2660da7841ede29124274\": container with ID starting with 1fb76d31e3d4f4cb5a7bb2046b18416c2d9fcfd5f3c2660da7841ede29124274 not found: ID does not exist" containerID="1fb76d31e3d4f4cb5a7bb2046b18416c2d9fcfd5f3c2660da7841ede29124274" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.224164 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fb76d31e3d4f4cb5a7bb2046b18416c2d9fcfd5f3c2660da7841ede29124274"} err="failed to get container status \"1fb76d31e3d4f4cb5a7bb2046b18416c2d9fcfd5f3c2660da7841ede29124274\": rpc error: code = NotFound desc = could not find container \"1fb76d31e3d4f4cb5a7bb2046b18416c2d9fcfd5f3c2660da7841ede29124274\": container with ID starting with 1fb76d31e3d4f4cb5a7bb2046b18416c2d9fcfd5f3c2660da7841ede29124274 not found: ID does not exist" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.224185 4937 scope.go:117] "RemoveContainer" containerID="5a1fea99170d6a132ae1bef3d6dbff17928f7d16c208e15d7a09d0d89977be81" Feb 25 17:06:33 crc kubenswrapper[4937]: E0225 17:06:33.224653 4937 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5a1fea99170d6a132ae1bef3d6dbff17928f7d16c208e15d7a09d0d89977be81\": container with ID starting with 5a1fea99170d6a132ae1bef3d6dbff17928f7d16c208e15d7a09d0d89977be81 not found: ID does not exist" containerID="5a1fea99170d6a132ae1bef3d6dbff17928f7d16c208e15d7a09d0d89977be81" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.224691 4937 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a1fea99170d6a132ae1bef3d6dbff17928f7d16c208e15d7a09d0d89977be81"} err="failed to get container status \"5a1fea99170d6a132ae1bef3d6dbff17928f7d16c208e15d7a09d0d89977be81\": rpc error: code = NotFound desc = could not find container \"5a1fea99170d6a132ae1bef3d6dbff17928f7d16c208e15d7a09d0d89977be81\": container with ID starting with 5a1fea99170d6a132ae1bef3d6dbff17928f7d16c208e15d7a09d0d89977be81 not found: ID does not exist" Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.420545 4937 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s7x5t"] Feb 25 17:06:33 crc kubenswrapper[4937]: I0225 17:06:33.434352 4937 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s7x5t"] Feb 25 17:06:35 crc kubenswrapper[4937]: I0225 17:06:35.384184 4937 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="223cbd97-6ef8-4734-9e2a-23140182ee36" path="/var/lib/kubelet/pods/223cbd97-6ef8-4734-9e2a-23140182ee36/volumes" Feb 25 17:06:44 crc kubenswrapper[4937]: I0225 17:06:44.899640 4937 scope.go:117] "RemoveContainer" containerID="afab612230d80486f74fe83ca0f6a3974a64f1110f38908bd02cc943639c9b3f"